Comment by 4silvertooth

3 days ago

>Psychologist Sarita Robinson at the University of Central Lancashire, UK, says that hallucinations are common when people are in isolation, usually occurring if there is also sensory deprivation, such as being in a dark room.

Are AI model hallucinations somehow linked to this? Do AI models also need some sort of socializing?

That's an interesting conjecture. Many of the attributes of AI that people object to (hallucination, sycophancy, psychosis, plagiarism, etc.) originate in human behavior. We see ourselves in the system and are, in a way, the ghost in the machine. So yeah, it could be that AI systems need socialization, and that may be a job for humans in the future. I'm now waiting for the first Dr. Susan Calvin.