Comment by angrydev
3 months ago
Exactly. Stop fooling people into thinking there’s a human typing on the other side of the screen. LLMs should be incredibly useful productivity tools, not emotional support.
How would you propose we address the therapist shortage then?
Who ever claimed there was a therapist shortage?
The process of providing personal therapy doesn't scale well.
And I don't know if you've noticed, but the world is pretty fucked up right now.
https://www.statnews.com/2024/01/18/mental-health-therapist-...
I think most western governments, and societies at large, have.
It's a demand-side problem. Improve society so that people feel less of a need for therapists.
Oh, so you think we should improve society somewhat, eh? But you yourself live in society. Gotcha!
I think therapists in training, or people providing crisis intervention support, can train/practice using LLMs acting as patients going through various kinds of issues. But people who need help should probably talk to real people.
Remember that a therapist is really a friend you are paying for.
Then make more friends.
>Remember that a therapist is really a friend you are paying for.
That's an awful, and awfully wrong definition that's also harmful.
It's also disrespectful and demeaning to both the professionals and people seeking help. You don't need to get a degree in friendship to be someone's friend. And having friends doesn't replace a therapist.
Please avoid saying things like that.
outlaw therapy
I don't know why you're being downvoted. Denmark's health system is pretty good except adult mental health. SOTA LLMs are definitely approaching a stage where they could help.
something something bootstraps
Food should only be for sustenance, not emotional support. We should only sell brown rice and beans, no more Oreos.
Oreos won't affirm your belief that suicide is the correct answer to your life problems, though.
That is mostly a dogmatic question, rooted in (western) culture, though. And even we have started to - begrudgingly - accept that there are cases where suicide is the correct answer to your life problems (usually as of now restricted to severe, terminal illness).
The point the OP is making is that LLMs are not reliably able to provide safe and effective emotional support as has been outlined by recent cases. We're in uncharted territory and before LLMs become emotional companions for people, we should better understand what the risks and tradeoffs are.
I wonder if statistically (hand waving here, I'm so not an expert in this field) the SOTA models do as much or as little harm as their human counterparts in terms of providing safe and effective emotional support. Totally agree we should better understand the risks and trade-offs, but I wouldn't be super surprised if they are statistically no worse than us meat bags at this kind of stuff.
They also are not reliably able to provide safe and effective productivity support.
Maybe there is a human typing on the other side, at least for some parts or all of certain responses. It hasn't been proven otherwise.