
Comment by candu

4 days ago

> but people have hundreds and thousands of conversations on these apps that can't be easily moved elsewhere.

Except these aren't conversations in the traditional sense. Yes, there's the history of prompts and responses exchanged. But the threads don't build on each other: there's no cross-conversational memory, such as you'd have in a human relationship. Even within a conversation the model is mostly stateless; the full context history is sent as input on each turn.
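The "stateless, resend everything" point can be sketched in a few lines. This is an illustrative client loop, not any vendor's actual API; the names and payload shape are assumptions.

```python
# Minimal sketch of a stateless chat turn: the client resends the ENTIRE
# message history on every call, because the model itself remembers nothing
# between requests. Payload shape and model name are illustrative only.

def build_request(history, new_user_message):
    """Return the full payload the model would see for this turn."""
    messages = history + [{"role": "user", "content": new_user_message}]
    return {"model": "some-chat-model", "messages": messages}

history = []
for turn in ["Hi", "What did I just say?"]:
    request = build_request(history, turn)
    # ...here the request would go to the model; it has no memory beyond
    # the `messages` list it was just handed...
    reply = {"role": "assistant", "content": f"(reply to {turn!r})"}
    history = request["messages"] + [reply]

# By the second call, the request already carries the whole first exchange.
```

Any apparent "memory" within a thread is just this growing list being replayed each time.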

So there's no real data or network effect moat - the moat is all in model quality (which is an extremely competitive race) and harness quality (same). I just don't think there's any real switching cost here.

This is not the case.

I use OpenAI a lot on the paid plan via the UI. It now knows absolutely loads about me and seems to have a massive amount of cross conversational memory. It's really getting very close to what you'd expect from a human conversation in this regard.

Sure the model itself is still stateless, and if you use the API then what you say is true.

But they are doing a great deal of unseen summarisation and longer-term context building behind the scenes in the web app; what you see in the current conversation history is just a fraction of what actually gets sent to the model.
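The kind of unseen context-building the parent describes could look something like the sketch below: a memory layer that summarises older chats and injects the result as a hidden system message. This is a guess at the general shape, not OpenAI's actual implementation; every name here is hypothetical.

```python
# Hypothetical memory layer: summarise prior conversations and prepend the
# summary before the visible history, so the model receives more than the
# user ever sees in the chat window. Purely illustrative.

def summarize(old_turns):
    # Stand-in for an LLM summarisation pass over earlier conversations.
    return "; ".join(t["content"][:20] for t in old_turns)

def assemble_context(saved_memories, visible_history, new_message):
    summary = summarize(saved_memories)
    return (
        [{"role": "system", "content": f"Known about this user: {summary}"}]
        + visible_history
        + [{"role": "user", "content": new_message}]
    )

memories = [{"content": "I prefer metric units and terse answers."}]
sent = assemble_context(memories, [], "How tall is Everest?")
# `sent` now starts with an injected system message the UI never displays.
```

If something like this runs server-side, the "stateless model" distinction stops mattering much to the user: the product, not the model, is what remembers.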

  • > It now knows absolutely loads about me

    Baffled that someone tech literate would be boasting about this in the year 2026. I mean, you do you, we all have different priorities and threat models, but this is the furthest from what I would personally want.

    • It's not boasting, I'm not sure why what I wrote would come across that way. I'm describing how I use a product and the functionality it presents to me.

      But yes, it's an emerging area and I am questioning if I am sharing too much with it. I 100% would not want my chat histories exposed.

      Saying that though, Facebook can read my highly personal messages, Google every email, my phone is tracking my every move, I have to sign up for random janky websites for my kids' school where their medical info is stored, etc.

      LLM chat history presents a new risk and a different set of data, but it's a crowded minefield already.

    • This is the same as when Google got big (and Facebook, etc...). We have some privacy focused competitors (Kagi, etc...) but most people are quite happy to just give Google (and worse, Facebook) everything.

      AI is just a new technology but this has been ramping up for decades now.

I see people who have conversations spanning months. They don't start new threads; instead they go back to existing threads to continue the topic, and they reference the prior threads' discussions many times.

This would feel like a switching cost for people who use the system that way.

  • They need to do some sort of shared chat, like being able to start a thread and then invite another ChatGPT user to join the conversation. That would add some network effects and switching cost.

    Maybe they already have this? I'm not a paid user.

ChatGPT and Gemini both have cross-conversation personalization. I believe the former is off by default and the latter is on.

  • Is there more detailed information on how this works? I used to assume that it can be beneficial to switch to a new chat to avoid having too much irrelevant context in the interaction. How does this personalization happen, and how does it decide which parts are relevant from one conversation to another?

    It doesn't seem like there's a way to inspect or alter what information Gemini has saved as "important information" about me (apart from deleting chats entirely, apparently).

    • There’s a toggle in every new Gemini chat to turn off personalization for that chat. I assume you need to make sure it’s on globally first?
