Comment by dingnuts
18 hours ago
Deception implies intent. This is confabulation, which was more widely called "hallucination" until this thread.
Confabulation doesn't require knowledge, and the only knowledge a language model has is the relationships between tokens. Sometimes that rhymes with reality closely enough to be useful, but it isn't knowledge of facts of any kind, and never has been.
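
To make that concrete, here is a toy sketch (a minimal illustration of the claim, not how any production model is implemented): a bigram model whose entire "knowledge" is token co-occurrence counts. Real transformers learn vastly richer statistics over longer contexts, but the principle is the same: the weights encode which tokens tend to follow which, not facts.

```python
from collections import Counter, defaultdict
import random

# Toy corpus. The model's only "knowledge" will be which token
# follows which in this text -- nothing else.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count, for each token, how often each successor appears.
successors = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    successors[prev][nxt] += 1

def next_token(prev):
    """Sample a continuation weighted purely by co-occurrence counts."""
    counts = successors[prev]
    if not counts:  # token never seen with a successor; fall back to any token
        return random.choice(corpus)
    tokens, weights = zip(*counts.items())
    return random.choices(tokens, weights=weights)[0]

# Generate text. The output can look plausible, but nothing here
# encodes whether a cat actually sat anywhere -- only that "cat"
# is often followed by "sat" or "ate" in the training data.
token = "the"
output = [token]
for _ in range(8):
    token = next_token(token)
    output.append(token)
print(" ".join(output))
```

When the sampled sequence happens to match reality, that is coincidence of statistics, not retrieval of a stored fact, which is the sense in which "confabulation" fits better than "deception".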