> all LLMs are bad friends and therapists.
Is that just your gut feel? Because there has been some preliminary research suggesting it's, at the very least, an open question:
https://neurosciencenews.com/ai-chatgpt-psychotherapy-28415/
https://pmc.ncbi.nlm.nih.gov/articles/PMC10987499/
https://arxiv.org/html/2409.02244v2
The first link says that patients can't reliably tell which single message came from the therapist and which from the LLM, and yeah, sounding plausible in a single message is a core LLM competency.
The second is a "how 2 use AI 4 therapy" piece; there's at least one paper like that for every field.
The last one found that LLMs were measurably worse at therapy than humans.
So, yeah, I'm comfortable agreeing that all LLMs are bad therapists, and bad friends too.
there's also been a spate of reports like this one recently https://www.papsychotherapy.org/blog/when-the-chatbot-become...
which is definitely worse than not going to a therapist
I do not think there are any documented cases of LLMs being reasonable friends or therapists, so I think it is fair to say that:
> All LLMs are bad friends and therapists
That said, it would not surprise me if LLMs are, in some cases, better than having nothing at all.
Something definitely makes me uneasy about it taking the place of interpersonal connection. But I also think the hardcore backlash involves an overcorrection that's dismissive of LLMs' actual language capabilities.
Sycophantic agreement (which I would argue is still palpably and excessively present) undermines their credibility as a source of independent judgment. But at a minimum, an LLM is capable of being a sounding board, echoing your sentiments back to you with a degree of conceptual understanding that should not be lightly dismissed.
Though given how agreeable LLMs are, I'd imagine there are also cases where they're worse than having nothing at all.
> Is that just your gut feel?
Here's my take further down the thread: https://news.ycombinator.com/item?id=44840311
> Is that just your gut feel?
An LLM is a language model, and the gestalt of human experience is not just language.
That is really a separate, unrelated issue.
Not everyone needs the deepest, most intelligent therapist in order to improve their situation. A lot of therapy turns out to be about what you say yourself, not what a therapist says to you. It's the very act of engaging thoughtfully with your own problems that helps, not some magic that the therapist brings. So if you could maintain a conversation with a tree, it would, in many cases, be therapeutically helpful. What the LLM does is facilitate your introspection more helpfully than a typical inanimate object would. This has been borne out by studies of people who have engaged in therapy sessions with an LLM interlocutor and reported positive results.
That said, an LLM wouldn't be appropriate in every situation, or for every affliction. At least not with the current state of the art.
That is an extreme claim, what is your source for this?
Absolutes, monastic take... Yeah, I imagine not a lot of people seek out your advice.