Comment by ForceBru
4 hours ago
> psychosis in the AI itself
I think you're anthropomorphising the AI too much: what does it mean for an LLM to have psychosis? This implies that LLMs have a soul, or a consciousness, or a psyche. But... do they?
Speaking of reality, one can easily become philosophical and say that we humans don't exactly "have" a reality either. All we have are sensor readings. LLMs' sensors are texts and images they get as input. They don't have the "real" world, but they do have access to tons of _representations_ of this world.
> I think you're anthropomorphising the AI too much
I don’t get it. Is that supposed to be a gotcha? Have you tried maliciously messing with an LLM? You can get it into a state that resembles psychosis. I mean, if you give it a context that is removed from reality, yet close enough to reality to act on, it will give you crazy output.