Comment by hhh
1 day ago
> ChatGPT has this issue where, when it doesn't know the explanation for something, it often won't hallucinate outright, but will instead create some long-winded, confusing word salad that sounds like it could be right but you can't quite tell.
This is just hallucinating.