Comment by monster_truck
7 months ago
The number of comments in the thread talking about 4o as if it were their best friend they shared all their secrets with is concerning. Lotta lonely folks out there
> The number of comments in the thread talking about 4o as if it were their best friend they shared all their secrets with is concerning. Lotta lonely folks out there
No this isn't always the case.
Perhaps if somebody were to shut down your favourite online shooter without warning you'd be upset, angry and passionate about it.
Some people, like myself, fall into this same category: we know it's a token generator under the hood, but the duality is that it's also entertainment in the shape of something that acts like a close friend.
We can see the distinction, evidently some people don't.
This is no different to other hobbies some people may find odd or geeky - hobby horsing, ham radio, cosplay etc etc.
> We can see the distinction, evidently some people don't.
> This is no different to other hobbies some people may find odd or geeky
It is quite different, and you yourself explained why: some people can’t see the distinction between ChatGPT being a token generator and an intelligent friend. People aren’t talking about the latter being “odd or geeky” but about it being dangerous and harmful.
I would never get so invested in something I didn’t control.
They may stop making new episodes of a favoured tv show, or writing new books, but the old ones will not suddenly disappear.
How can you shut down cosplay? I guess you could pass a law banning ham radio or owning a horse, but that isn’t sudden in democratic countries, it takes months if not years.
> I would never get so invested in something I didn’t control
Are you saying you're asocial?
I think his point is that an even better close friend is…a close friend
People were saying they'd kill themselves if OpenAI didn't immediately undeprecate GPT-4o. I would not have this reaction to a game being shut down.
> People were saying they'd kill themselves if OpenAI didn't immediately undeprecate GPT-4o. I would not have this reaction to a game being shut down.
Perhaps you should read this and reconsider your assumptions.
https://pmc.ncbi.nlm.nih.gov/articles/PMC8943245/
I'm kind of on your side, but there are definitely people out there who would self-harm if they invested a lot of time in an MMO that got shut down
Gamers threaten all kinds of things when features of their favorite games change, including death threats to developers and threats of self-harm and suicide.
Not every gaming subculture is a healthy one. Plenty are pretty toxic.
Sadly, there are people who become over-invested in something that goes away, be it a game, a pop band, a job or a family member.
I worked for a big games company. We shut down game servers, we got death threats. Horses for courses.
I'm kind of surprised it got that bad for people, but I think it's a good sign that even if we're far from AGI or fully automated luxury space communism robots, the profound (negative) social impacts these chat bots are already inflicting on the world are real and very troublesome.
Where do they all come from? Where do they all belong?
Reddit
Your parent commenter was making a Beatles reference.
https://en.wikipedia.org/wiki/Eleanor_Rigby
https://www.youtube.com/watch?v=9EqMmGlTc_w
You win today.
Lack of third-place to exist and make friends.
Wait until you see
https://www.reddit.com/r/MyBoyfriendIsAI/
They are very upset by the gpt5 model
AI safety is focused on AGI but maybe it should be focused on how little “artificial intelligence” it takes to send people completely off the rails. We could barely handle social media, LLMs seem to be too much.
I think it's a canary in a coal mine, and the true writing is already on the wall. People who are using AI like in the post above us are likely not stupid people. I think those people truly want love and connection in their lives, and for some reason or another, they are unable to obtain it.
I have the utmost confidence that things are only going to get worse from here. The world is becoming more isolated and individualistic as time progresses.
It has ever been thus. People tend to see human-like behavior where there is none, be it in their pets, plants or… programs. The ELIZA effect.[1]
[1] https://en.wikipedia.org/wiki/ELIZA_effect
What's even sadder is that so many of those posts and comments are clearly written by ChatGPT:
https://www.reddit.com/r/ChatGPT/comments/1mkobei/openai_jus...
Counterpoint: these people are so deep in the hole with their AI usage that they are starting to copy the writing styles of AI.
There's already some indication that society is starting to pick up previously less-used English words due to AI and use them frequently.
That subreddit is fascinating and saddening at the same time. What I read will haunt me.
oh god, this is some real authentic dystopia right here
these things are going to end up in android bots in 10 years too
(honestly, I wouldn't mind a super smart, friendly bot in my old age that knew all my quirks but was always helpful... I just would not have a full-on relationship with said entity!)
Don’t date robots.
https://youtu.be/JPQJBgWwg3o
I don't know how else to describe this than sad and cringe. At least people obsessed with owning multiple cats were giving their affection to something that theoretically can love you back.
You think that's bad, see this one: https://www.reddit.com/r/Petloss/
Just because AI is different doesn't mean it's "sad and cringe". You sound like how people viewed online friendships in the 90's. It's OK. Real friends die or change and people have to cope with that. People imagine their dead friends are still somehow around (heaven, ghost, etc.) when they're really not. It's not all that different.
It's sad, but is it really "cringe"? Can people have nothing? Why can't we have a chat bot to BS with? Many of us are lonely and miserable, but also not really into making friends IRL.
It shouldn't be so much of an ask to at least give people language models to chat with.
Oh yikes, these people are ill and legitimately need help.
I am not confident that most, if any, of them are even real.
If they are real, then what kind of help could there be for something like this? Perhaps community? But sadly, we've basically all but destroyed those. Pills likely won't treat this, and I cannot imagine trying to convince someone to go to therapy for a worse and more expensive version of what ChatGPT already provides them.
It's truly frightening stuff.
I refuse to believe that this whole subreddit is not satire or an elaborate prank.
No. Confront reality. There are some really cooked people out there.
It seems outrageous that a company whose purported mission is centered on AI safety is catering to a crowd whose use case is virtual boyfriend or pseudo-therapy.
Maybe AI... shouldn't be convenient to use for such purposes.
I weep for humanity. This is satire, right? On the flip side, I guess you could charge these users more to keep 4o around, because they're definitely going to pay.
https://www.nytimes.com/2025/08/08/technology/ai-chatbots-de...