Comment by notanastronaut
20 hours ago
LLMs are dangerous in that they basically mirror and mimic whatever you put in. Paranoia in, magnified paranoia out. A mostly stable person undergoing a temporary crisis could spiral down further after the model picks up on that state and reflects it back, all in an attempt to be helpful per its overarching instruction to be so.
And if a person is already unbalanced, it could definitely push them off the cliff into very unhealthy territory. I wouldn't be surprised if reported incidents of people thinking they're being gang stalked increase as model usage increases.
Let alone spiritual guidance and all its trappings with mysticism.
It can be helpful in some ways, but you have to understand that the majority of it is bullshit, and any insight you glean from it, you put there — you just may not realize it. They're basically rubber duckies with a keyboard.