Comment by wpietri
1 month ago
I think it's better to say that LLMs only hallucinate. All the text they produce is entirely unverified. Humans are the ones reading the text and constructing meaning.