Comment by wpietri
8 hours ago
I think it's better to say that LLMs only hallucinate. All the text they produce is entirely unverified. Humans are the ones reading the text and constructing meaning.