
Comment by ffsm8

2 days ago

> None of this needs the model to grow strong reasoning skills. That's not where the real money is.

I never thought about it like that, but it sounds plausible.

However, I feel like getting to that stage is even harder to get right than reasoning?

Aside from the <0.1% of severely mentally unwell people who already imagine themselves to be in relationships with AIs, I don't think many normal people will form lingering attachments to them without solving the issue of permanence and memory.

They're currently essentially stateless. While that's surely enough for short-term attachment, I'm not seeing this become a bigger issue because of that glaring shortfall.

It'd be like being in a relationship with a person with dementia; that's not a happy state of being.

Honestly, I think this trend is severely overstated until LLMs can sufficiently emulate memories and shared experiences. And that's still fundamentally impossible, just like "real" reasoning with understanding.

So, after thinking about it more, I disagree: emulated reasoning will likely have a bigger revenue stream via B2E applications than emotional attachment in B2C...

(The top post on HN right now is announcing that Claude lets you buy a 1M-token context window. Extrapolate a few years.

Generally, there is a push towards 'context engineering', and there is a lot of bleeding-edge research into snapshotting large contexts so that the next back-and-forth turn in the conversation is fast, etc. So optimisations are already being made.)
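The snapshotting idea above can be sketched as a toy prefix cache. This is a hypothetical illustration, not any vendor's actual implementation: each "snapshot" is keyed by a hash of the conversation prefix, so a follow-up turn that shares a prefix with an earlier turn skips reprocessing it and only does work for the new messages (the real systems cache expensive attention state; here a trivial sum stands in for that work).

```python
import hashlib

class PrefixCache:
    """Toy sketch: reuse a cached snapshot of the longest shared
    conversation prefix instead of reprocessing every turn."""

    def __init__(self):
        self._snapshots = {}  # prefix hash -> simulated processed state
        self.hits = 0
        self.misses = 0

    def _key(self, messages):
        # Hash the prefix; \x1f separator avoids ambiguous joins.
        joined = "\x1f".join(messages)
        return hashlib.sha256(joined.encode()).hexdigest()

    def process(self, messages):
        # Find the longest already-cached prefix, newest-first.
        start, state = 0, None
        for end in range(len(messages), 0, -1):
            key = self._key(messages[:end])
            if key in self._snapshots:
                state, start = self._snapshots[key], end
                self.hits += 1
                break
        else:
            self.misses += 1
        # "Process" only the turns after the cached prefix
        # (a trivial length-sum stands in for real model work).
        state = state or 0
        for msg in messages[start:]:
            state += len(msg)
        # Snapshot the full conversation for the next turn.
        self._snapshots[self._key(messages)] = state
        return state

cache = PrefixCache()
convo = ["system prompt", "user: hi", "assistant: hello"]
cache.process(convo)                   # first turn: miss, full work
cache.process(convo + ["user: more"])  # reuses the 3-message snapshot
```

The second call only "processes" the one new message, which is the whole point: the per-turn cost stays proportional to what's new, not to the entire (possibly 1M-token) context.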