Comment by chrisweekly
8 days ago
Respectfully, while I concur that there's a lot of influencer / life coach nonsense out there, I disagree that LLMs are the solution. Therapy isn't supposed to scale. It's the relationship that heals. A "relationship" with an LLM has an obvious, intrinsic, and fundamental problem.
That's not to say there isn't any place at all for use of AI in the mental health space. But LLMs are in no way able to replace a living, empathetic human being; the dismal picture you paint of mental health workers does them a disservice. For context, my wife is an LMHC who runs a small group practice (and I have a degree in cognitive psychology, though my career is in tech).
This ChatGPT interaction is illustrative of the dangers of putting trust in an LLM: https://amandaguinzburg.substack.com/p/diabolus-ex-machina
> Therapy isn't supposed to scale. It's the relationship that heals.
My understanding is that modern evidence-based therapy is basically a checklist of "common sense" advice, a few filters to check if it's the right advice ("stop being lazy" vs "stop working yourself to death" are both good advice depending on context) and some tricks to get the patient to actually listen to the advice that everyone already gives them (e.g. making the patient think they thought of it). You can lead a horse to water, but a skilled therapist's job is to get it to actually drink.
As far as I can see, the main issue with a lot of LLMs would be that they're fine-tuned to agree with people, and most people who benefit from therapy are there because they have some terrible ideas that they want to double down on.
Yes, the human connection is one of the "tricks". And while an LLM could be useful for someone who actually wants to change, I suspect a lot of people will just find it too easy to "doctor shop" until they find an LLM that tells them their bad habits and lifestyle are totally valid. I think there's probably some good in LLMs but in general they'll probably just be like using TikTok or Twitter for therapy - the danger won't be the lack of human touch but that there's too much choice for people who make bad choices.
Respectfully, that view completely trivialises a clinical profession.
Calling evidence-based therapy a "checklist of advice" is like calling software engineering a "checklist for typing". A therapist's job isn't to give advice. Their skill is using clinical training to diagnose deep cognitive and behavioural issues, then applying a structured framework to help a person work on those issues themselves.
The human connection is the most important clinical tool. The trust it builds is the foundation needed to even start that difficult work.
Source: a lifelong recipient of talk therapy.
> Source: a lifelong recipient of talk therapy.
All the data we have shows that psychotherapy outcomes follow a predictable dose-response curve. The benefits of long-term psychotherapy are statistically indistinguishable from a short course of treatment, because the marginal utility of each additional session of treatment rapidly approaches zero. Lots of people believe that the purpose of psychotherapy is to uncover deep issues and that this process takes years, but the evidence overwhelmingly contradicts this - nearly all of the benefits of psychotherapy occur early in treatment.
https://pubmed.ncbi.nlm.nih.gov/30661486/
Your understanding is wrong. What you’re describing is executive coaching — useful advice for already high-functioning people.
Ask a real practitioner and they’ll tell you most real therapy is exactly the thing you dismiss as a trick: human connection.
No, what they're describing is manualized CBT. We have abundant evidence that there is little or no difference in outcomes between therapy delivered by a "real practitioner" and basic CBT delivered by a nurse or social worker with very basic training, or even an app.
https://pubmed.ncbi.nlm.nih.gov/23252357/
They’ve done studies showing that the quality of the relationship between the therapist and the client is a stronger predictor of successful outcomes than the type of modality used.
Sure, they may be talking about common sense advice, but there is something else going on that affects the person on a different subconscious level.
How do you measure the "quality of the relationship"? It seems like whatever metric is used, it is likely to correlate with whatever is used to measure "successful outcomes".
> A "relationship" with an LLM has an obvious, intrinsic, and fundamental problem.
What exactly do you mean? What do you think a therapist brings to the table that an LLM cannot?
Empathy? I have been participating in exchanges with AI that felt a lot more empathetic than 90% of the people I interact with every day.
Let's be honest: a therapist is not a close friend - in fact, a good therapist knows how to keep a professional distance. Their performative friendliness is as fake as the AI's friendliness, and everyone recognises that when it's invoicing time.
To be blunt, AI never tells me that ‘our time is up for this week’ after an hour of me having an emotional breakdown on the couch. How’s that for empathy?
> Empathy? I have been participating in exchanges with AI that felt a lot more empathetic than 90% of the people I interact with every day.
You must be able to see all the hedges you put in that claim.
You're misreading my intent - this isn't adversarial rhetoric. I'm not making a universal claim that every LLM is always more empathetic than any human. There's nothing to disprove or falsify here because I'm clearly describing a subjective experience.
What I'm saying is that, in my observation, the curve leans in favour of LLMs when it comes to consistent friendliness or reasonably perceived (simulation of) empathy. Most people simply don't aim for that as a default mode. LLMs, on the other hand, are usually tuned to be patient, attentive, and reasonably kind. That alone gives them, in many cases, a distinct edge in how empathetic they feel — especially when someone is in a vulnerable state and just needs space and a kind voice.
> It's the relationship that heals.
Ehhh. It’s the patient who does the healing. The therapist holds open the door. You’re the one who walks into the abyss.
I’ve had some amazing therapists, and I wouldn’t trade some of those sessions for anything. But it would be a lie to say you can’t also have useful therapy sessions with chatgpt. I’ve gotten value out of talking to it about some of my issues. It’s clearly nowhere near as good as my therapist. At least not yet. But she’s expensive and needs to be booked in advance. ChatGPT is right there. It’s free. And I can talk as long as I need to, and pause and resume the session whenever I want.
One person I’ve spoken to says they trust chatgpt more than a human therapist because chatgpt won’t judge them for what they say. And they feel more comfortable telling chatgpt to change its approach than they would with a human therapist, because they feel anxious about bossing a therapist around. If it's the relationship that heals, why can't a relationship with chatgpt heal just as well?
That was a very interesting read, it's funny because I have done and experienced (both sides) of what the LLM did here.
Don't get me wrong, there are many phenomenal mental health workers, but it's a taxing role, and the ones who are exceptional possess skills that are far more valuable outside of dealing with broken people, not to mention the exposure to vicarious trauma.
I think maybe "therapy" is the problem and that open source, local models developed to walk people through therapeutic tools and exercises might be the scalable help that people need.
You only need to look at some of the wild stories on the chatgpt subreddit to start to wonder at its potential. I recently read two stories from posters who self-treated ongoing physical conditions (back pain and jaw clicking) using LLMs, only to have several commenters come out and explain it had helped them too.
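To make the local-model idea a bit more concrete, here's a rough sketch of the kind of thing I'm imagining, with the caveat that the endpoint, model name, and system prompt below are just placeholders (it assumes an Ollama server exposing its OpenAI-compatible API locally), not a vetted therapeutic tool:

```python
# Minimal sketch: a local model stepping a user through a basic thought-record
# style exercise. Assumes a local Ollama server exposing its OpenAI-compatible
# API at localhost:11434; the model name and prompt are illustrative only.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

SYSTEM_PROMPT = (
    "Guide the user through a standard thought-record exercise: situation, "
    "automatic thought, evidence for and against, balanced thought. "
    "Ask one question at a time. Do not diagnose. If the user mentions "
    "self-harm or crisis, stop and point them to local emergency services."
)

def run_exercise():
    history = [{"role": "system", "content": SYSTEM_PROMPT}]
    print("Type 'quit' to stop.")
    while True:
        user = input("> ")
        if user.strip().lower() == "quit":
            break
        history.append({"role": "user", "content": user})
        reply = client.chat.completions.create(
            model="llama3.1:8b",  # any locally pulled model
            messages=history,
        )
        text = reply.choices[0].message.content
        history.append({"role": "assistant", "content": text})
        print(text)

if __name__ == "__main__":
    run_exercise()
```

Everything stays on the user's machine, which is part of the appeal, but it's still only walking someone through an exercise, not doing therapy.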
> Therapy isn't supposed to scale.
As I see it "therapy" is already a catch-all terms for many very different things. In my experience, sometimes "it's the relationship that heals", other times it's something else.
E.g. as I understand it, cognitive behavioral therapy is up there in terms of evidence base. In my experience it's more of a "learn cognitive skills" modality than an "it's the relationship that heals" modality. (As compared with, say, psychodynamic therapy.)
For better or for worse, to me CBT feels like an approach that doesn't go particularly deep, but is in some cases effective anyway. And it's subject to some valid criticism for that: in some cases it just gives the patient more tools to bury issues more deeply; functionally patching symptoms rather than addressing an underlying issue. There's tension around this even within the world of "human" therapy.
One way or another, a lot of current therapeutic practice is an attempt to "get therapy to scale", with associated compromises. Human therapists are "good enough", not "perfect". We find approaches that tend to work, gather evidence that they work, create educational materials and train people up to produce more competent practitioners of those approaches, then throw them at the world. This process is subject to the same enshittification pressures and compromises that any attempts at scaling are. (The world of "influencer" and "life coach" nonsense even more so.)
I expect something akin to "ChatGPT therapy" to ultimately fit somewhere in this landscape. My hope is that it's somewhere between self-help books and human therapy. I do hope it doesn't completely steamroll the aspects of real therapy that are grounded in "it's the [human] relationship that heals". (And I do worry that it will.) I expect LLMs to remain a pretty poor replacement for this for a long time, even in a scenario where they are "better than human" at other cognitive tasks.
But I do think some therapy modalities (not just influencer and life coach nonsense) are a place where LLMs could fit in and make things better with "scale". Whatever it is, it won't be a drop-in replacement. If it goes this way, I think we'll (have to) navigate new compromises and develop new therapy modalities for this niche that are relatively easy to "teach" to an LLM while still being effective and safe.
Personally, the main reason I think replacing human therapists with LLMs would be wildly irresponsible isn't "it's the relationship that heals"; it's whether an LLM can remain grounded and e.g. "escalate" when appropriate. (Like recognizing signs of a suicidal client and behaving appropriately, e.g. pulling a human into the loop. I trust self-driving cars to drive more safely than humans, and to pull over when they can't [after ~$1e11 of investment]. I have less trust in an LLM-driven therapist to "pull over" at the right time.)
To me that's a bigger sense in which "you shouldn't call it therapy" if you hot-swap an LLM in place of a human. In therapy, the person on the other end is a medical practitioner with an ethical code and responsibilities. If anything, I'm relying on them to wear that hat more than I'm relying on them to wear a "capable of human relationship" hat.
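To sketch what I mean by "pull over": something shaped like the toy guardrail below, where the keyword screen and the clinician hook are purely illustrative placeholders I made up for this comment. A real deployment would need clinically validated risk assessment and an actual on-call human, which is exactly the hard part.

```python
# Toy sketch of an "escalate to a human" wrapper around an LLM reply.
# The risk check is a naive keyword screen and the escalation hook is a
# print statement; both are placeholders, not a real risk model.
RISK_PHRASES = ("kill myself", "end my life", "suicide", "hurt myself")

def assess_risk(message: str) -> bool:
    """Return True if the message looks like it needs human attention."""
    lowered = message.lower()
    return any(phrase in lowered for phrase in RISK_PHRASES)

def notify_on_call_clinician(message: str) -> None:
    # Placeholder: page a human reviewer (email, ticket, on-call rota...).
    print("ESCALATED:", message)

def handle_message(message: str, llm_reply) -> str:
    """llm_reply is any callable that turns a user message into model text."""
    if assess_risk(message):
        notify_on_call_clinician(message)
        return ("I'm handing this conversation to a person who can help. "
                "If you're in immediate danger, please contact local "
                "emergency services.")
    return llm_reply(message)
```

The point isn't that this is adequate (it obviously isn't); it's that the whole value of the guardrail lives in the parts I had to stub out.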