
Comment by somenameforme

9 days ago

I think he's focusing on the distinction, for humans, between facts and the words used to express them, and drawing a parallel to LLMs.

If I ask you something that you know the answer to, the words you use and the fact itself are distinct entities. You're just giving me a presentation layer for fact #74719.

But LLMs lack any comparable pool to draw from, and so their words and their answer are essentially the same thing.
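To make the distinction concrete, here's a toy sketch in Python. The fact ID, the store, and the render() helper are all made up for illustration: the point is just that a human-style answer is a rendering of something stored separately, whereas in this framing an LLM has no separate store behind the words.

```python
# Toy sketch of the "presentation layer over a fact store" idea.
# The fact ID, store contents, and render() helper are hypothetical,
# purely to illustrate the distinction drawn above.

FACTS = {
    74719: {"subject": "water", "property": "boiling point at 1 atm", "value": "100 °C"},
}

def render(fact_id: int) -> str:
    """Turn a stored fact into words; the words are not the fact itself."""
    fact = FACTS[fact_id]
    return f"The {fact['property']} of {fact['subject']} is {fact['value']}."

# Human-style answer: look up the fact, then pick words for it.
print(render(74719))

# An LLM-style answer (in this framing) has no separate lookup step:
# the generated words *are* the answer, with no fact #74719 behind them.
```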