Comment by nolok
11 hours ago
No he does not.
He is not saying it's OK for this system to provide wrong answers; he is saying it's normal for information from an LLM to be unreliable, and thus the issue comes not from the LLM but from the way it is being used.