Comment by wpietri
16 days ago
I think it's better to say that LLMs only hallucinate. All the text they produce is entirely unverified. Humans are the ones reading the text and constructing meaning.