Comment by belval
2 months ago
> I worry about the damage caused by these things on distressed people. What can be done?
Why? We are gregarious animals; we need social connections. ChatGPT has guardrails that keep this mostly safe, and it helps with the loneliness epidemic.
It's not like people doing this are likely thriving socially in the first place, better with ChatGPT than on some forum à la 4chan that will radicalize them.
I feel like this will be one of the "breaks" between generations: Millennials and Gen Z will be purists who call human-to-human connections the only "real" kind and treat anything with "AI" as inherently fake and unhealthy, whereas Alpha and Beta will treat it as a normal part of their lives.
The tech industry's capacity to rationalize anything, including psychosis, as long as it can make money off it is truly incredible. Even the temporarily embarrassed founders that populate this message board do it openly.
> Even the temporarily embarrassed founders that populate this message board do it openly.
I'm not a wannabe founder; I don't even use LLMs aside from Cursor. It's a bit disheartening that instead of engaging at all with a thought-provoking idea, you went straight for the ad hominem.
There is plenty to disagree with, plenty of counter-arguments to what I wrote. You could have argued that human connection is special or exceptional even, anything really. Instead I get "temporarily embarrassed founders".
Whether you accept it or not, using LLMs as friends is becoming common because they are good enough for humans to get attached to. Dismissing it as psychosis is reductive.
Thinking that a text completion algorithm is your friend, or can be your friend, indicates some detachment from reality (or some truly extraordinary capability of the algorithm?). People don't have that reaction with other algorithms.
Maybe what we're really debating here isn't whether it's psychosis on the part of the human, it's whether there is something "there" on the part of the computer.
We need a Truth and Reconciliation Commission for all of this someday, and a lot of people will need to be behind bars, if there be any healing to be done.
> Truth and Reconciliation Commission for all of this someday, and a lot of people will need to be behind bars
You missed a cornerstone of Mandela's process.
Social media is digital smoking: Facebook lied about measurable harms, there's no generational divide, it's the same game in a different flavor. Greed is good, as they say. /s
https://en.wikipedia.org/wiki/Deaths_linked_to_chatbots
If you read through that list and dismiss it as people who were already mentally ill or more susceptible to this... that's what Dr. K (psychiatrist) assumed too until he looked at some recent studies: https://youtu.be/MW6FMgOzklw?si=JgpqLzMeaBLGuAAE
Clickbait title, but well researched and explained.
Fyi, the `si` query parameter is used by Google for tracking purposes and can be removed.
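For anyone who wants to do this programmatically rather than by hand, here's a minimal sketch using Python's standard library (the function name is my own, and `si` is just the default; pass other parameter names to strip those too):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_tracking_params(url, params=("si",)):
    """Return the URL with the named query parameters removed."""
    parts = urlsplit(url)
    # Keep only the query pairs that aren't in the blocklist.
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in params]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_tracking_params("https://youtu.be/MW6FMgOzklw?si=JgpqLzMeaBLGuAAE"))
# https://youtu.be/MW6FMgOzklw
```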
load-bearing "mostly"
https://www.cnn.com/2025/11/06/us/openai-chatgpt-suicide-law...
Using ChatGPT to numb social isolation is akin to using alcohol to numb anxiety.
ChatGPT isn't a social connection: LLMs don't connect with you. There is no relationship growth, just an echo chamber with one occupant.
Maybe it's a little healthier for society overall if people spiral deeper into loneliness with an AI chat, withdrawn to the point of suicide, instead of being radicalised to mass murder by forum bots and propagandists. But those are not the only two options out there.
Join a club. It doesn't really matter what it's for, so long as you like the general gist of it (and, you know, it's not "plot terrorism"). Sit in the corner and do the club thing, and social connections will form whether you want them to or not. Be a choir nerd, be a bonsai nut, do macrame, do crossfit, find a niche thing you like that you can do in a group setting, and loneliness will fade.
Numbing it will just make it hurt worse when the feeling returns, and it'll seem like the only answer is more numbing.
> social connections will form whether you want them to or not
Not true for all people or all circumstances. People are happy to leave you in the corner while they talk amongst themselves.
> it'll seem like the only answer is more numbing
For many people, the only answer is more numbing.
This is an interesting point. Personally, I am neutral on it. I'm not sure why it has received so many downvotes.
You raise a good point about forums with real people that can radicalise someone. I would offer a dark alternative: it is only a matter of time before forums are essentially replaced by an AI-generated product finely tuned to each participant. Something a bit like Ready Player One.
Regarding your last paragraph: what do "Alpha and Beta" mean? I only know those terms from the context of Red Pill dating advice.
Gen Alpha is people born roughly 2010-2020, younger than gen Z, raised on social media and smartphones. Gen Beta is proposed for people being born now.
Radicalising forums are already filled with bots, but there's no need to finely tune them to each participant because group behaviours are already well understood and easily manipulated.