Comment by zug_zug
8 days ago
Rather than hear a bunch of emotional/theoretical arguments, I'd love to hear the preferences of people here who have both been to therapy and talked to an LLM about their frustrations, and how those experiences stack up.
My limited personal experience is that LLMs are better than the average therapist.
My experience with both is fairly limited, but I do have that insight to offer, I guess.
Real therapist came first, prior to LLMs, so this was years ago. The therapist I went to didn't exactly explain what therapy really is and what she could do for me. We were both operating on shared expectations that she later revealed were not actually shared. When a friend told me afterwards that "in the end, you're the one who's responsible for your own mental health", it really stuck with me. I was expecting revelatory conversations, big philosophical breakthroughs. That's not how it works. It's nothing like physical ailments either. There's simply no way to directly help someone like that, which was pretty rough to recognize. We're not Rubik's Cubes waiting to be solved, certainly not for now anyway. And there was and is no one who, in the literal sense, can actually help me.
With LLMs, I had different expectations, so the end results sat better with me too. I'm not completely ignorant of the tech either, so that helps. The good thing is that it's always readily available, presents as high effort, generally says the right things, has infinite "patience and compassion" available, and is free. The bad thing is that everything it says feels crushingly hollow. I'm not the kind to parrot the "AI is soulless" mantra, but on these topics, its attempts to cheer me up felt extremely frustrating. At the same time, though, I was able to ask a bunch of reasonable questions and get reasonable-sounding responses I hadn't thought of: What am I supposed to do? Why are people like this and that? I was then able to explore some coping mechanisms, habit strategies, and alternative perspectives.
I'm sure there are people who are a lot less able to keep LLMs in their place, or who are significantly more in need of professional therapy than I am, but I'm incredibly glad this capability exists. I really don't like weighing on my peers at the frequency I get certain thoughts. They don't deserve to have to put up with them; they have their own lives going on. I want them to enjoy whatever happiness they have, not be worried or weighed down. It also just gets stale after a while. That's not really an issue with a virtual conversational partner.
What does "better" mean to you though?
Is it - "I was upset about something and I had a conversation with the LLM (or human therapist) and now I feel less distressed." Or is it "I learned some skills so that I don't end up in these situations in the first place, or they don't upset me as much."?
Because if it's the first, then that might be beneficial but it might also be a crutch. You have something that will always help you feel better so you don't actually have to deal with the root issue.
That can certainly happen with human therapists, but I worry that the people-pleasing nature of LLMs, the lack of introspection, and the limited context window make it much more likely that they are giving you what you want in the moment, but not what you actually need.
See, this is why I said what I said in my question -- because it sounds to me like a lot of the people with strong opinions here haven't actually talked to many therapists.
I had one who just kinda listened and said next to nothing other than generalizations of what I said, and then suggested I buy a generic CBT workbook off Amazon to track my feelings.
Another one was mid-negotiations/strike with Kaiser, and I had to lie and say I hadn't had any weed in the past year(!) just to get Kaiser to let me talk to him, and TBH it seemed like he had a lot on his own plate.
I think it's super easy to make an argument based on Good Will Hunting or some hypothetical human therapist in your head.
So to answer your question -- none of the three made a lasting difference, but ChatGPT at least is able to be a sounding board/rubber duck in a way that helped me articulate and discover my own feelings and gave me temporary clarity.
For a relatively literate and high-functioning patient, I think that LLMs can deliver good quality psychotherapy that would be within the range of acceptable practice for a trained human. For patients outside of that cohort, there are some significant safety and quality issues.
The obvious example of patients experiencing acute psychosis has been fairly well reported - LLMs aren't trained to identify acutely unwell users and will tend to entertain delusions rather than saying "you need to call an ambulance right now, because you're a danger to yourself and/or other people". I don't think that this issue is insurmountable, but there are some prickly ethical and legal issues with fine-tuning a model to call 911 on behalf of a user.
The much more widespread issue IMO is users with limited literacy, or a weak understanding of what they're trying to achieve through psychotherapy. A general-purpose LLM can provide a very accurate simulacrum of psychotherapeutic best practice, but it needs to be prompted appropriately. If you just start telling ChatGPT about your problems, you're likely to get a sympathetic ear rather than anything that would really resemble psychotherapy.
For the kind of people who use HN, I have few reservations about recommending LLMs as a tool for addressing common mental illnesses. I think most of us are savvy enough to use good prompts, keep the model on track and recognise the shortcomings of a very sophisticated guess-the-next-word machine. LLM-assisted self-help is plausibly a better option than most human psychotherapists for relatively high-agency individuals. For a general audience, I'm much more cautious, and I'm not at all confident that the benefits outweigh the risks. A number of medtech companies are working on LLM-based psychotherapy tools, and I think many of them will develop products that fly through FDA approval with excellent safety and efficacy data, but ChatGPT is not that product.
I made another comment about this, but I went to a psychologist as a teen and found it absolutely useless. To be fair, I was sent for silly reasons - I was tired all the time from an actual undiagnosed medical issue that they just figured was depression - but if I had been depressed, I think it might have made things worse. I don't need to sit there and talk about what's going on in my life with very little feedback; I can effectively do that in my own head.
I just asked an LLM about a specific mental health thing that was bothering me and it gave me some actual tips that might help. It was instant, helpful, and cheap. While I'm sure someone with severe depression or anxiety should see someone who won't forget what was said several thousand tokens ago, I think LLMs will be super helpful for the mental health of the majority of people.
> I'd love to hear the preferences of people here who have both been to therapy and talked to an LLM about their frustrations and how those experiences stack up.
I've spent years on and off talking to some incredible therapists. And I've had some pretty useless therapists too. I've also talked to ChatGPT about my issues for about 3 hours in total.
In my opinion, ChatGPT is somewhere in the middle between a great therapist and a useless one. It's nowhere near as good as some of the incredible therapists I've had, but I've still had some really productive therapy conversations with it. Not enough to replace my therapist - but it works in a pinch. It helps that I don't have to book in advance or pay. In a crisis, ChatGPT is right there.
With ChatGPT, the big caveat is that you get what you prompt. It has all the knowledge it needs, but it doesn't have good instincts for what comes next in a therapy conversation. When it's not sure, it often defaults to affirmation, which often isn't helpful or constructive. I find I kind of have to ride it a bit. I say things like "stop affirming me. Ask more challenging questions." Or "I'm not ready to move on from this. Can you reflect back what you heard me say?" Or "please use the IFS technique to guide this conversation."
With ChatGPT, you get out what you put in. Most people have probably never had a good therapist - they're far rarer than they should be. But unfortunately, that also means most people probably don't know how to prompt ChatGPT to be useful either. I think there would be massive value in a better fine-tune here, to get ChatGPT to act more like the best therapists I know.
I'd share my ChatGPT sessions, but they're obviously quite personal. I add comments to guide ChatGPT's responses about every 3-4 messages. When I do that, I find it's quite useful - much more useful than some paid human therapy sessions. But my great therapist? I don't need to prompt her at all. It's the other way around.
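If anyone wants to try front-loading that steering instead of correcting the model every few messages, here's a minimal sketch of how I'd do it with the OpenAI Python SDK. The system prompt wording, the model name, and the helper function are just my own illustration (assumptions on my part, not a vetted therapeutic protocol), but it captures the standing instructions I otherwise keep having to repeat:

    # Minimal sketch: steer a chat model toward reflective, non-affirming,
    # therapy-style conversation via a standing system prompt.
    # Assumes the OpenAI Python SDK (pip install openai) and OPENAI_API_KEY
    # in the environment; prompt text and model name are illustrative only.
    from openai import OpenAI

    client = OpenAI()

    SYSTEM_PROMPT = (
        "Act as a reflective-listening counselor. Do not default to affirmation. "
        "Ask one challenging, open-ended question at a time. Reflect back what I "
        "said before moving on, and only change topic when I say I'm ready."
    )

    history = [{"role": "system", "content": SYSTEM_PROMPT}]

    def turn(user_message: str) -> str:
        """Send one user message and keep the reply in the running history."""
        history.append({"role": "user", "content": user_message})
        response = client.chat.completions.create(
            model="gpt-4o",  # illustrative model choice
            messages=history,
        )
        reply = response.choices[0].message.content
        history.append({"role": "assistant", "content": reply})
        return reply

    print(turn("I keep replaying an argument I had with my boss."))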
They were trained, in no small part, on Reddit content. You only need to look at the kind of advice reddit gives for any kind of relationship questions to know this is asking for trouble.
> You only need to look at the kind of advice reddit gives for any kind of relationship questions to know this is asking for trouble.
This depends on the subreddit.
Suggest good ones, perhaps?
All the ones I've seen are terrible. But that's likely because I don't subscribe to them and they are large enough to occasionally show up on Reddit's front page. Anything that shows up on the front page is crap.
The point, though, is that the LLM likely has had more exposure to the larger subreddits than to the better, smaller ones, and so is perhaps more likely to reflect the attitudes of the larger ones.
Even if that is true, the LLMs have still been trained on all of the subreddits.