Comment by jrflowers
7 days ago
> So for those people, the LLM is replacing having nothing, not a therapist.
Considering how actively harmful it is to use language models as a “therapist”, this is like pointing out that some people who don’t have access to therapy drink heavily. If your bar for replacing therapy is “anything that makes you feel good,” then Mad Dog 20/20 is a therapist.
That’s an extremely large accusation. Do you have any evidence to suggest this is as harmful as you say?
It’s not really that contentious a statement. Language models encouraging delusions is pretty well documented.
https://www.rollingstone.com/culture/culture-features/ai-spi...
https://www.psychologytoday.com/us/blog/psych-unseen/202507/...
And we’re in a comment thread about a study that concluded:
> LLMs 1) express stigma toward those with mental health conditions and 2) respond inappropriately to certain common (and critical) conditions in naturalistic therapy settings
And it’s been shown to be addictive:
https://www.tomshardware.com/tech-industry/artificial-intell...
So if you overheard somebody say “I don’t do that stuff because it’s addictive and people go crazy on it,” you would probably assume they were talking about a substance. Or at the very least, you would not assume they were talking about seeing a therapist.