Show HN: Built a portable memory layer that works across AI tools


Hey HN, I've been frustrated that every time I switch AI tools (ChatGPT, Claude, Gemini, etc.), I lose all context about my preferences, projects, and past conversations. So I built a portable memory system that maintains context across different AI platforms. It became the #1 product of the day on Product Hunt.

How it works: a browser extension lets you store highlights, conversations, file uploads, etc., and then adds that context inline to any of your queries in any agent.

Current state:

- Works with all major AI platforms (ChatGPT, Claude, Grok, Gemini, Perplexity, etc.)

- You can create any number of different contexts and select which one to use at any time

- We're implementing TEEs (trusted execution environments) on the backend for privacy
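To make the "memory bucket" idea concrete, here's a simplified sketch of the flow: named context stores, one of which is selected and prepended inline to a query before it goes to any agent. All names here are illustrative, not the extension's actual internals:

```python
class MemoryLayer:
    """Toy model of a portable, cross-agent memory layer."""

    def __init__(self):
        self.buckets = {}   # bucket name -> list of saved snippets
        self.active = None  # currently selected bucket

    def save(self, bucket, snippet):
        """Store a highlight or conversation excerpt in a named bucket."""
        self.buckets.setdefault(bucket, []).append(snippet)

    def select(self, bucket):
        """Choose which context gets injected into subsequent queries."""
        self.active = bucket

    def inject(self, query):
        """Prepend the active bucket's context to the outgoing query."""
        snippets = self.buckets.get(self.active, [])
        if not snippets:
            return query
        context = "\n".join(f"- {s}" for s in snippets)
        return f"Context from my saved notes:\n{context}\n\nQuery: {query}"


mem = MemoryLayer()
mem.save("work", "I prefer TypeScript over JavaScript")
mem.select("work")
print(mem.inject("Review this function"))
```

Because the injection happens on the query itself, the same bucket works unchanged whether the query is going to ChatGPT, Claude, or any other agent.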

What I'm unsure about: I'm trying to figure out if this is actually valuable enough for people to pay for, or if it's just solving my own niche problem. Questions for this community:

- Do you find context-switching between AI tools painful enough to pay to solve it?

- What would make you trust a third-party service with your AI conversation context?

- Is this something you'd expect to be free, or does $12/month seem reasonable?

Would love honest feedback - especially if you think this is a terrible idea or I'm missing something obvious.

PH Launch: https://www.producthunt.com/products/ai-context-flow

AI Context Flow extension: https://chromewebstore.google.com/detail/ai-context-flow-imp...

It makes sense to me. Personally, I've been using multiple agents on a day-to-day basis to compare responses and generate content, among other things. I've also used this extension quite a lot in the past. Since the extension is currently free, I can create as many memory buckets as I want and make unlimited AI queries. I think I might eventually pay to use my memory or create new ones.

Full disclosure: I’m also part of the team that built this extension, and our entire team uses it to test functionality. So technically, we’re our own first customers.

That is an interesting concept. My company is paying for both ChatGPT and Claude, so I don't really mind paying $12 a month myself, or letting the company cover it, to bridge these tools. It would actually help me, since I wouldn't need to upload the docs every time and update them manually.

I would worry about the security though: does it store all of my conversations with all the chatbots I use?

This is actually one of the few 'memory layer' approaches that feels grounded in how we - the users - actually switch between models. We want continuity across LLMs instead of being restricted to one.

The monetization question really comes down to trust: if it feels like your own memory (private, portable, revocable), the value prop is obvious.

$12/month seems fair when I factor in how much context rebuilding time it saves me each week.