Comment by i80and
8 hours ago
There's something very dark about a machine accessible in everybody's pocket that roleplays whatever role they happen to fall into: the ultimate bad friend, the terminal yes-and-er. No belief, no inner desires, just pure sycophancy.
I see people on here pretty regularly talk about using ChatGPT for therapy, and I can't imagine a faster way to cook your own brain unless you have truly remarkable self-discipline. At which point, why are you turning to the black box for help?
Isn't it just like diary-writing or memo-writing, as far as therapy goes, the point being to crystallise thoughts and cathartise emotions? Is it really so bad to have a textual nodding dog to bat against as part of that process? {The very real issue of the OP aside.}
Could you expand on why you feel this is the fastest way to "cook your own brain"?
The mind is much more sensitive to writing it didn't produce itself. If it produced the writing, then it is at least somewhat aware of the emotional state of the writer and can contextualize. If it is reading writing from an outside "observer", it assumes far more objectivity, especially when the observer's perspective was sought for therapeutic reasons in the first place, even if they know that at best they'll be getting pseudo-therapy.
It is obviously very different to solo writing. The burden should be on you to explain why it’s so similar that this line of conversation is worthwhile.
The burden? We're not in court; it seems similar to me, so I was asking the commenter for a response.
I've used LLMs in this way a couple of times. I'd like to see responses; there's obviously no obligation to 'defend', but the OP (or others) may have wished to ... like a conversation.
Somewhat ironically, this is a way that LLMs are preferred and why people use them (eg instead of StackOverflow) - because you don't get berated for being inquisitive.
If you have unusual self-discipline and mental rigor, yes, you can use LLMs as a rubber duck that way. I would be severely skeptical of the value over a diary. But humans are, in an astonishing twist, wired to assume that if they're being replied to, there's a mind like theirs behind those replies.
The more subjective the topic, the more volatile the user's state of mind, the more likely they are to gaze too deep into that face on the other side of their funhouse mirror and think it actually is their friend, and that it "thinks" like they do.
I'm not even anti-LLM as an underlying technology, but the way chatbot companies operate in practice is kind of a novel attack on our social brains, and it warrants a warning!
>humans are, in an astonishing twist, wired to assume that if they're being replied to, there's a mind like theirs behind those replies
Interesting, not really part of my experience (though I'll need to reflect on it); thanks for sharing. It's a little like when people discover their aphantasia isn't the common experience of most other people. I tend towards strong scepticism (I'm fond of Pyrrhonism), but assume others to be weakly sceptical rather than blindly accepting.
If I write in a diary it does not write back at me.
A diary is there for you to reflect, introspect or reminisce. It doesn't actively reinforce your bad or good thoughts. If it did, it could easily trick your mind into taking that as validation of those negative thoughts.
If someone still wants to treat an LLM as a diary, they should treat it as if they were writing in Tom Riddle's diary.
Therapy is not a process where you only pour yourself out to a person and empty yourself. Even if no medication is involved, the therapist guides you through your emotions, mostly in a pretty neutral manner, but not without nuance.
The therapist nudges you in the right direction to face yourself, but in a safe manner, by staggering the process or slowing you down and changing your path.
A sycophantic auto-complete has none of these qualities, bar some slapped-on "guardrails" that abruptly kick you to another subject like a pinball bumper. It can't think, sense danger, or provide real feedback and support.
If all you need is a hole into which you can empty yourself, and you're otherwise healthy and self-aware, then by all means hand your personal information and training data to an AI company. Otherwise the whole thing is very dangerous for a deluded or unstable mind.
On the other hand, solo writing requires you to think, filter and write. You need to be aware of yourself, or pause and think deeper to root things out. Yes, it's not smooth all the time, and the things you write are not easy to pour out, but at least you are with yourself, and you can see what's coming out and where you are. Moreover, using pen and paper creates a deeper state of mind than typing on a keyboard, which takes the process further still in that regard.
Sorry, I was not likening LLMs to the entire gamut of therapy, only saying they seem - to me - to be a tool akin to diary-writing.
Interesting idea about pen & paper - I've been using computer keyboards (and, way back, an occasional typewriter) for most of my life and have written far more through a keyboard; it's more immersive for me as I don't have to think about the mechanics, whereas with a pen I struggle to express myself legibly and can't get the words out as quickly. (I'm swiping on a phone now, which is horrible; even worse than using a pen!)
No, it isn't just like that.
Because?
Why even comment on a social forum if you're not going to say something substantive?