Comment by angrydev
4 hours ago
Exactly. Stop fooling people into thinking there’s a human typing on the other side of the screen. LLMs should be incredibly useful productivity tools, not emotional support.
Food should only be for sustenance, not emotional support. We should only sell brown rice and beans, no more Oreos.
The point the OP is making is that LLMs are not reliably able to provide safe and effective emotional support, as recent cases have shown. We're in uncharted territory, and before LLMs become emotional companions for people, we should better understand what the risks and tradeoffs are.
I wonder whether statistically (hand-waving here, I'm so not an expert in this field) the SOTA models do as much or as little harm as their human counterparts in terms of providing safe and effective emotional support. Totally agree we should better understand the risks and tradeoffs, but I wouldn't be super surprised if they are statistically no worse than us meat bags at this kind of stuff.
They also are not reliably able to provide safe and effective productivity support.
How would you propose we address the therapist shortage then?
I think therapists in training, or people providing crisis intervention support, can train/practice using LLMs acting as patients going through various kinds of issues. But people who need help should probably talk to real people.
Who ever claimed there was a therapist shortage?
https://www.statnews.com/2024/01/18/mental-health-therapist-...
outlaw therapy
something something bootstraps
Maybe there is a human typing on the other side, at least for some parts or all of certain responses. It's not been proven otherwise.