Comment by theothertimcook
8 days ago
Shitty start-up LLMs should not replace therapists.
There have never been more psychologists, psychiatrists, counsellors, social workers, life coaches and therapy flops at any time in history, and yet mental illness prevalence is at an all-time high and climbing.
Just because you're a human and not an LLM doesn't mean you're not a shit therapist. Maybe you did your training at the peak of the replication crisis? Maybe you've got your own foibles that prevent you from being effective in the role?
Where I live, it takes 6-8 years and a couple hundred grand to become a practicing psychologist; it really is only an option for the elite, which is fine if you're counselling people from similar backgrounds, but not when you're dealing with people from lower socioeconomic classes with experiences that weren't even on your radar, and that's only if they can afford the time and $$ to see you.
So now we have mental health social workers and all these other "helpers" whose job is just to do their job, not fix people.
LLM "therapy" is going to and has to happen, the study is really just a self reported benchmarking activity, " I wouldn't have don't it that way" I wonder what the actual prevalence of similar outcomes is for human therapists?
Setting aside all of the life coach and influencer drivel that people engage with, which is undoubtedly harmful.
LLMs offer access to good-enough help at a cost, scale and availability that human practitioners can only dream of.
Respectfully, while I concur that there's a lot of influencer / life coach nonsense out there, I disagree that LLMs are the solution. Therapy isn't supposed to scale. It's the relationship that heals. A "relationship" with an LLM has an obvious, intrinsic, and fundamental problem.
That's not to say there isn't any place at all for use of AI in the mental health space. But they are in no way able to replace a living, empathetic human being; the dismal picture you paint of mental health workers does them a disservice. For context, my wife is an LMHC who runs a small group practice (and I have a degree in cognitive psychology though my career is in tech).
This ChatGPT interaction is illustrative of the dangers in putting trust in a LLM: https://amandaguinzburg.substack.com/p/diabolus-ex-machina
> Therapy isn't supposed to scale. It's the relationship that heals.
My understanding is that modern evidence-based therapy is basically a checklist of "common sense" advice, a few filters to check if it's the right advice ("stop being lazy" vs "stop working yourself to death" are both good advice depending on context) and some tricks to get the patient to actually listen to the advice that everyone already gives them (e.g. making the patient think they thought of it). You can lead a horse to water, but a skilled therapist's job is to get it to actually drink.
The main issue I see with a lot of LLMs is that they're fine-tuned to agree with people, and most people who would benefit from therapy are there because they have some terrible ideas that they want to double down on.
Yes, the human connection is one of the "tricks". And while an LLM could be useful for someone who actually wants to change, I suspect a lot of people will just find it too easy to "doctor shop" until they find an LLM that tells them their bad habits and lifestyle are totally valid. I think there's probably some good in LLMs, but in general they'll probably just be like using TikTok or Twitter for therapy: the danger won't be the lack of human touch but that there's too much choice for people who make bad choices.
Respectfully, that view completely trivialises a clinical profession.
Calling evidence-based therapy a "checklist of advice" is like calling software engineering a "checklist for typing". A therapist's job isn't to give advice. Their skill is using clinical training to diagnose deep cognitive and behavioural issues, then applying a structured framework to help a person work on those issues themselves.
The human connection is the most important clinical tool. The trust it builds is the foundation needed to even start that difficult work.
Source: a lifelong recipient of talk therapy.
Your understanding is wrong. What you’re describing is executive coaching — useful advice for already high-functioning people.
Ask a real practitioner and they’ll tell you most real therapy is exactly the thing you dismiss as a trick: human connection.
They’ve done studies showing that the quality of the relationship between the therapist and the client is a stronger predictor of successful outcomes than the type of modality used.
Sure, they may be talking about common-sense advice, but there is something else going on that affects the person on a deeper, subconscious level.
> A "relationship" with an LLM has an obvious, intrinsic, and fundamental problem.
What exactly do you mean? What do you think a therapist brings to the table an LLM cannot?
Empathy? I have been participating in exchanges with AI that felt a lot more empathetic than 90% of the people I interact with every day.
Let's be honest: a therapist is not a close friend - in fact, a good therapist knows how to keep a professional distance. Their performative friendliness is as fake as the AI's friendliness, and everyone recognises that when it's invoicing time.
To be blunt, AI never tells me that ‘our time is up for this week’ after an hour of me having an emotional breakdown on the couch. How’s that for empathy?
> Empathy? I have been participating in exchanges with AI that felt a lot more empathetic than 90% of the people I interact with every day.
You must be able to see all the hedges you put in that claim.
> It's the relationship that heals.
Ehhh. It’s the patient who does the healing. The therapist holds open the door. You’re the one who walks into the abyss.
I’ve had some amazing therapists, and I wouldn’t trade some of those sessions for anything. But it would be a lie to say you can’t also have useful therapy sessions with ChatGPT. I’ve gotten value out of talking to it about some of my issues. It’s clearly nowhere near as good as my therapist, at least not yet. But she’s expensive and needs to be booked in advance. ChatGPT is right there. It’s free. And I can talk as long as I need to, and pause and resume the session whenever I want.
One person I’ve spoken to says they trust ChatGPT more than a human therapist because ChatGPT won’t judge them for what they say. And they feel more comfortable telling ChatGPT to change its approach than they would with a human therapist, because they feel anxious about bossing a therapist around. If it's the relationship that heals, why can't a relationship with ChatGPT heal just as well?
That was a very interesting read. It's funny, because I have done and experienced both sides of what the LLM did here.
Don't get me wrong, there are many phenomenal mental health workers, but it's a taxing role, and the exceptional ones possess skills that are far more valuable outside of dealing with broken people, not to mention the exposure to vicarious trauma.
I think maybe "therapy" is the problem, and that open-source, local models developed to walk people through therapeutic tools and exercises might be the scalable help that people need.
You only need to look at some of the wild stories on the ChatGPT subreddit to start to wonder at its potential. I recently read two stories of posters who self-treated ongoing physical conditions using LLMs (back pain and jaw clicking), only to have several commenters come out and explain it helped them too.
> Therapy isn't supposed to scale.
As I see it, "therapy" is already a catch-all term for many very different things. In my experience, sometimes "it's the relationship that heals", other times it's something else.
E.g. as I understand it, cognitive behavioral therapy is up there in terms of evidence base. In my experience it's more of a "learn cognitive skills" modality than an "it's the relationship that heals" modality. (As compared with, say, psychodynamic therapy.)
For better or for worse, to me CBT feels like an approach that doesn't go particularly deep, but is in some cases effective anyway. And it's subject to some valid criticism for that: in some cases it just gives the patient more tools to bury issues more deeply; functionally patching symptoms rather than addressing an underlying issue. There's tension around this even within the world of "human" therapy.
One way or another, a lot of current therapeutic practice is an attempt to "get therapy to scale", with associated compromises. Human therapists are "good enough", not "perfect". We find approaches that tend to work, gather evidence that they work, create educational materials and train people up to produce more competent practitioners of those approaches, then throw them at the world. This process is subject to the same enshittification pressures and compromises that any attempts at scaling are. (The world of "influencer" and "life coach" nonsense even more so.)
I expect something akin to "ChatGPT therapy" to ultimately fit somewhere in this landscape. My hope is that it's somewhere between self-help books and human therapy. I do hope it doesn't completely steamroll the aspects of real therapy that are grounded in "it's the [human] relationship that heals". (And I do worry that it will.) I expect LLMs to remain a pretty poor replacement for this for a long time, even in a scenario where they are "better than human" at other cognitive tasks.
But I do think some therapy modalities (not just influencer and life coach nonsense) are a place where LLMs could fit in and make things better with "scale". Whatever it is, it won't be a drop-in replacement. If it goes this way, I think we'll (have to) navigate new compromises and develop new therapy modalities for this niche that are relatively easy to "teach" to an LLM, while being effective and safe.
Personally, the main reason I think replacing human therapists with LLMs would be wildly irresponsible isn't "it's the relationship that heals", it's an LLM's ability to remain grounded and e.g. "escalate" when appropriate. (Like recognizing signs of a suicidal client and behaving appropriately, e.g. pulling a human into the loop. I trust self-driving cars to drive more safely than humans, and to pull over when they can't [after ~$1e11 of investment]. I have less trust in an LLM-driven therapist to "pull over" at the right time.)
To me that's a bigger sense in which "you shouldn't call it therapy" if you hot-swap an LLM in place of a human. In therapy, the person on the other end is a medical practitioner with an ethical code and responsibilities. If anything, I'm relying on them to wear that hat more than I'm relying on them to wear a "capable of human relationship" hat.
> psychologists, psychiatrists, counsellors and social workers
Psychotherapy (especially actual depth work rather than CBT) is not something that is commonly available, affordable or ubiquitous. You've said so yourself. As someone who has an undergrad in psychology - and could not afford the time or fees (an additional 6 years after undergrad) to become a clinical psychologist - the world is not drowning in trained psychologists. Quite the opposite.
> I wonder what the actual prevalence of similar outcomes is for human therapists?
There's a vast corpus on the efficacy of different therapeutic approaches. Readily googleable.
> but not when you're dealing with people from lower socioeconomic classes with experiences that weren't even on your radar
You seem to be confusing a psychotherapist with a social worker. There's nothing intrinsic to socioeconomic background that would prevent someone from understanding a psychological disorder or the experience of distress. Although I agree with the implicit point that enormous amounts of psychological suffering are due to financial circumstances.
The proliferation of 'life coaches', 'energy workers' and other such hooey is a direct result, and a direct parallel to the substitution of both alternative medicine and over-the-counter medications for unaffordable care.
I note you've made no actual argument for the efficacy of LLMs beyond that they exist and people will use them... which is of course true, but also a tautology.
You're right, you can pretty much run that line backwards for scarcity/availability: shrink, psych, social worker, counsellor.
I was shocked at how many psychiatrists deal almost exclusively with the treatment and titration of ADHD medication; some are 100% remote via Zoom.
I've been involved with the publishing of psychology research, and my faith in that system is low (see the replication crisis comments). Beyond that, working in and around mental health I hear of interactions where psychologists or MH social workers have "prescribed" bible study and the like, so it's anecdotal evidence combined with my own experiences over the years.
Re: socioeconomic backgrounds, you said so yourself: many cannot afford to go the clinical psych route. Increasingly the profession has become pretty exclusive, and probably not for the better.
Agree regarding the snake oilers, but you can't discount distrust and disenfranchisement of/from the establishment and institutions.
'This Way Up' is already offering self-paced online CBT, and I see LLMs as an extension of that, if only for the simple fact that a person can open a new tab and start the engagement without a referral, appointment, transport, cost, or even really any idea of how the process works.
In fact, I'm certain it is already happening based on reading the ChatGPT subreddit. As for efficacy, I don't think we'll ever really know. I know that I personally would be more comfortable being totally honest with a text box than a living, breathing human, so who knows. I appreciate your insights though.
> it really is only an option for the elite, which is fine if you're counselling people from similar backgrounds, but not when you're dealing with people from lower socioeconomic classes with experiences that weren't even on your radar
A bizarre qualm. Why would a therapist need to be from the same socioeconomic class as their client? They aren't giving clients life advice. They're giving clients specific services that their training prepared them to provide.
They don’t need to be from the same class, but without insurance traditional once-a-week therapy costs as much as rent, and society-wide, insurance can’t actually reduce the price.
Many LMHCs have moved to cash-only with sliding scale.
> They're giving clients specific services that that training prepared them to provide.
And what would that be?
Cognitive behavioral therapy, dialectic behavioral therapy, EMDR, acceptance and commitment therapy, family systems therapy, biofeedback, exposure and response prevention, couples therapy...?
LLMs are about as good at "therapy" as talking to a friend who doesn't understand anything about the internal, subjective experience of being human.
And yet, studies show that journaling is super effective at helping to sort out your issues. Apparently in one study, participants rated journaling as more effective than 70% of counselling sessions. I don’t need my journal to understand anything about my internal, subjective experience. That’s my job.
Talking to a friend can be great for your mental health if your friend keeps the attention on you, asks leading questions, and reflects back what you say from time to time. ChatGPT is great at that if you prompt it right. Not as good as a skilled therapist, but good therapists are expensive and in short supply. ChatGPT is way better than nothing.
I think a lot of it comes down to prompting though. I’m untrained, but I’ve both had amazing therapists and filled that role for years in many social groups. I know what I want ChatGPT to ask me when we talk about this stuff. It’s pretty good at following directions. But I bet you’d have a way worse experience if you don’t know what you need.
How would you prompt it, or what directions would you ask it to follow?
Also, that friend has amnesia, and you know for absolutely certain that the friend doesn't actually care about you in the least.
> There have never been more psychologists, psychiatrists, counsellors, social workers, life coaches and therapy flops at any time in history, and yet mental illness prevalence is at an all-time high and climbing.
The last time I saw a house fire, there were more firefighters at that property than at any other house on the street and yet the house was on fire.
Virology, immunology, and oncology have eradicated entire illnesses and reduced cancer mortality by double digits.
Psychology nearly crashed the peer-review system and now recognises excessive use of Xbox as a mental illness.
I've tried both, and the core component that is missing is empathy. A machine can emulate empathy, but it's just platitudes. An LLM will never be able to relate to you.
What if they're the same levels of mental health issues as before?
Before we'd just throw them in a padded prison.
Welcome Home, Sanitarium
"There have never been more doctors, and yet we still have all of these injuries and diseases!"
Sorry, that argument just doesn't make a lot of sense to me, for a whole lot of reasons.
It is similar to "we've got all these super useful and productive ways to work out (weight lifting, cardio, yoga, gymnastics, martial arts, etc.) yet people drink, smoke, consume sugar, sit all day, etc."
We cannot blame X or Y. "It takes a village". It requires "me" to get my ass off the couch, it requires a friend to ask me to go for a hike, and so on.
We've got many solutions and many problems. We have to pick the better activity (sit vs walk, smoke vs not, etc.).
Having said that, LLMs can help, but the issue with relying on an LLM (imho) is that if you take a wrong path (like Interstellar's TARS with the X parameter set too damn high) you can be derailed, while a decent (certified doc) therapist will redirect you to see someone else.
> What if they're the same levels of mental health issues as before?
Maybe, but this raises the question of how on Earth we'd ever know we were on the right track when it comes to mental health. With physical diseases it's pretty easy to show that public health systems in the developed world have been broadly successful over the last 100 years: fewer people die young, dramatically fewer children die in infancy, and survival rates for a lot of diseases are much improved. Obesity is clearly a major problem, but even allowing for that, the average person is likely to live longer than their great-grandparents.
It seems inherently harder to know whether the mental health industry is achieving the same level of success. If we massively expand access to therapy and everyone is still anxious/miserable/etc., at what point will we be able to say "maybe this isn't working"?
Answer: Symptom management.
There's a whole lot of diseases and disorders we don't know how to cure in healthcare.
In those cases, we manage symptoms. We help people develop tools to manage their issues. Sometimes it works, sometimes it doesn't. Same as a lot of surgeries, actually.
Psychology has succeeded in creating new disorders, while fields like virology, immunology and oncology are eradicating diseases and improving mortality rates.
It was these professions and their predecessors doing the padded-cell confinement, lobotomising, etc.
This should not be considered an endorsement of technology so much as an indictment of the failure of extant social systems.
The role where humans with broad life experience and even temperaments guide those with narrower, shallower experience is an important one. While it can be filled by the modern idea of a "therapist," I think that's too reliant on a capitalist worldview.
Saying that LLMs fill this role better than humans can - in any context - is, at best, wishful thinking.
I wonder if "modern" humanity has lost sight of what it means to care for other humans.
> LLMs offer access to good-enough help at a cost, scale and availability that human practitioners can only dream of.
No