Comment by wintermute22
7 days ago
"The real question is can they do a better job than no therapist. That's the option people face."
This is the right question.
The answer is most definitely no. LLMs are not set up to deal with the nuances of the human psyche. We're in real danger of an LLM accidentally reinforcing dangerous lines of thinking. It's only a matter of time till we get a "ChatGPT made me do it" headline.
Too many AI hype folks out there think that humans don't need humans. We are social creatures, even as introverts. Interacting with an LLM is like talking to an evil mirror.
Already seeing tons of news stories about ChatGPT inducing psychosis. The one that sticks in my mind was the 35-year-old in Florida who was gunned down by police after his AI girlfriend claimed she was being killed by OpenAI.
Now, I don't think a person with chronic major depression or someone with schizophrenia is going to get what they need from ChatGPT, but those are extremes. Most people using ChatGPT have non-extreme problems. It's the same territory the self-help industry has tried to address for decades: there are self-help books on all sorts of topics that one might see a therapist for - anxiety, grief, marital difficulty - and these are the kinds of things ChatGPT can help with, because it tends to give the same sort of advice.
> It's a matter of time till we get a "ChatGPT made me do it" headline.
Brother, we are here already.