
Comment by antonvs

21 hours ago

Not all human hallucinations are lies, though. I really think you’re not fully thinking this through. People have beliefs because of, essentially, their training data.

A good example of this is religious belief. All the evidence suggests that religious belief is essentially 100% hallucination. Its nature may differ somewhat from that of LLM hallucinations, but in terms of how reliably these entities' claims track reality, I don’t see much difference. I will say that LLMs are better at acknowledging errors than humans tend to be, though that may largely be due to their training toward sycophancy.

The bottom line, though, is that I don’t agree humans are less subject to hallucinations than LLMs are. As long as a significant number of humans rabbit on about “higher powers”, afterlives, “angels”, “destiny”, etc., that’s a ridiculously difficult position to defend.