Comment by ToucanLoucan
1 day ago
The difference between a hallucination and a lie is important though: a hallucination is a lie with no motivation behind it, which can make it significantly harder to detect.
If you went to a hardware store and asked for a spark plug socket without knowing the size, and a customer service person recommended an imperial set of three even though your vehicle is metric, that would be akin to an LLM's hallucination: it didn't happen for any particular reason; they just filled in information where none was available. An actual person, even one not terribly committed to their job, would ask what size you need or, failing that, what year of car you drive.
Not all human hallucinations are lies, though. I really think you're not fully thinking this through. People hold beliefs because of, essentially, their training data.
A good example of this is religious belief. All the evidence suggests that religious belief is essentially 100% hallucination. It may differ in nature from an LLM's hallucinations, but in terms of how reliable what these entities say turns out to be, I don't see much difference. I will say that LLMs are better at acknowledging errors than humans tend to be, though that may largely be due to training that makes them sycophantic.
The bottom line, though, is that I don't agree humans are less subject to hallucinations than LLMs are. As long as a significant number of humans rabbit on about "higher powers", afterlives, "angels", "destiny", etc., that's a ridiculously difficult position to defend.