Comment by ben_w

14 hours ago

Indeed.

With humans, every so often I find myself in a conversation where the other party has a wildly incorrect understanding of what I've said, and it can be impossible to get them out of that zone. Rare, but it happens. With LLMs, much as I like them for breadth of knowledge, it happens most days.

That said, with LLMs I can reset the conversation at any point, backtracking to before the misunderstanding began — but even that trick doesn't always work, so the net result is that LLMs are still worse at understanding me than real humans are.