
Comment by j-pb

2 months ago

After having spoken with one of the people there, I'm a lot less concerned, to be honest.

They described it as something akin to an emotional vibrator: something they didn't attribute any sentience to, and that didn't trigger the PTSD they normally experienced when dating men.

If AI can provide emotional support and an outlet for survivors who would otherwise not be able to have that kind of emotional need fulfilled, then I don't see any issue.

Most people who develop AI psychosis have a period of healthy use beforehand. It becomes very dangerous when a person cuts back time with their real friends to spend more time with the chatbot: you have no one to keep you grounded in reality, and a feedback loop can form.

I think there's a difference between "support" and "enabling".

It is well documented that family members of someone suffering from an addiction will often do their best to shield the person from the consequences of their acts. While well-intentioned ("If I don't pay this debt they'll have an eviction on their record and will never find a place again"), these acts prevent the addict from seeking help because, without consequences, the addict has no reason to change their ways. Actually helping them requires, paradoxically, letting them hit rock bottom.

An "emotional vibrator" that (for instance) dampens that person's loneliness is likely to result in that person taking longer (if ever) to seek help for their PTSD. IMHO it may look like help when it's actually enabling them.

The problem is that chatbots don't provide emotional support. To support someone with PTSD, you help them gradually untangle the strong feelings around a stimulus and develop a less intense response. It's not fast and it's not linear, and it requires a mix of empathy and facilitation.

Using an LLM for social interaction instead of real treatment is like taking heroin because you broke your leg, and not getting it set or immobilized.

  • > To support someone with PTSD, you help them gradually untangle the strong feelings around a stimulus and develop a less intense response.

    It's about replaying frightening thoughts and activities in a safe environment. When the brain notices they don't trigger suffering, it fears them less in the future. A chatbot can provide such a safe environment.

    • > A chatbot can provide such a safe environment.

      It really can't. No amount of romancing a sycophantic robot is going to prepare someone to actually talk to a human being.

  • > instead of real treatment

    Ah yes, because America is well known for actually providing that at a reasonable price and availability...

It may not be a concern now, but it comes down to how well they maintain critical thinking. Epistemic drift, when you have a system that is designed (or reinforced) to empathize with you, can create long-term effects that aren't noticeable in any single interaction.

Related: "Delusions by design? How everyday AIs might be fuelling psychosis (and what can be done about it)" ( https://doi.org/10.31234/osf.io/cmy7n_v5 )

  • I don't disagree that AI psychosis is real. I've met people who believed they were going to publish at NeurIPS because of the nonsense ChatGPT told them, who believed the UI mockups Claude gave them were actually producing insights into its inner workings instead of just being blinking SVGs, and I even encountered someone at a startup event pitching an idea that I'm 100% sure is AI slop.

    My point was just that the interaction I had with someone from r/myboyfriendisai wasn't one of those delusional ones. For that, I would take r/artificialsentience as a much better example. That place is absolutely nuts.

    • Dear god, there's more! I'll need a drink for this one.

      However, I suspect I have better resistance to schizo posts than emotionally weird posts.

  • Wouldn't there necessarily be correlative effects in professional settings a la programming?

    • Not necessarily: transactional, impersonal directions to a machine to complete a task don't automatically imply, in my mind, the sorts of feedback loops necessary to induce AI psychosis.

      All CASE tools, however, displace human skills, and all unused skills atrophy. I struggle to read code without syntax highlighting after decades of using it to replace my own ability to parse syntactic elements.

      Perhaps the slow shift risk is to one of poor comprehension. Using LLMs for language comprehension tasks - summarising, producing boilerplate (text or code), and the like - I think shifts one's mindset to avoiding such tasks, eventually eroding the skills needed to do them. Not something one would notice per interaction, but that might result in a major change in behaviour.


    • Acceptance of vibe coding prompt-response answers from chatbots without understanding the underlying mechanisms comes to mind as akin to accepting the advice of a chatbot therapist without critically thinking about the response.

> If AI can provide emotional support and an outlet for survivors who would otherwise not be able to have that kind of emotional need fulfilled, then I don't see any issue.

Surely something that can be good can also be bad at the same time? Like how wrapping yourself in bubble wrap before leaving the house will provably reduce your incidence of getting scratched and cut outside, but there are also reasons you shouldn't do that...

Why do so many women have PTSD from dating?

  • "PTSD" is going through the same semantic inflation as the word "trauma". Or perhaps you could say the common meaning is an increasingly more inflated version of the professional meaning. Not surprising since these two are sort of the same thing.

    BTW, a more relevant word here is schizoid / schizoidism, not to be confused with schizophrenia. Or at least very strongly avoidant attachment style.

  • [flagged]

    • The parent post is getting flak, but it’s hard to see why it is controversial. I have heard “women want a man who will provide and protect” from every single woman I have ever dated or been married to, from every female friend with whom I could have such deep conversations, and from the literature I read in my anthropology-adjacent academic field. At some point one feels one has enough data to reasonably assume it’s a heterosexual human universal (in the typological sense, i.e. not denying the existence of many exceptions).

      I can believe that many women are having a hard time under modernity, because so many men no longer feel bound by the former expectations of old-school protector and provider behavior. Some men, like me, now view relationships as two autonomous individuals coming together to share sublime things like hobbies, art or travel, but don’t want to be viewed as a source of security. Other men might be just extracting casual sex from women and then will quickly move on. There’s much less social pressure on men to act a certain way, which in turn impacts on what women experience.


    • Nonsense. Chimpanzees and bonobos are our closest living relatives. Have a look at how they operate.

      From what I can tell, men have caused significant damage to women's psyches: men who turn women into a commodity plaything instead of a fellow human being.

      Women are human beings just like men, they aren't some alien species. Trauma hurts their psyche, not pleasure. If women were in a safe, supportive, mature society, some would be monogamous, some would be poly, some would be non-committal (but honest), and some would be totally loose. Just like men. In every case they would be safe to be who they are without abuse.

      Instead, and this is where men and women deviate, it is not safe. Men will often kill or crush women, physically, professionally, and often at random. Women are not allowed to walk around at night because some men having a bad day or a wild night may not be able to control themselves, and most of society is just okay with this. Police in large swaths of the world do not help make anything safer, in fact they make it more dangerous.

      The only reason women who are more monogamous can seem better off is because society does not make room for those who aren't that way. And there are many who aren't that way. There are many who are forced to mask as that way because it is often dangerous otherwise. At large, a prison for women has been created. I think that people may even enjoy how dangerous it is, in order to force women to seek the safety of a man.

      Most of society doesn't make room for liberated women and it is heartbreaking. I will dream of a future where I can meet women as total equals, in all walks of life, without disproportionate power, where all of us as humans are free to be who we are in totality.


    • > nobody is yet ready to have a serious discussion about this.

      There are a ton of people that are happy to have serious discussions about how their superior knowledge of biology gives them oracular insight into the minds of women. These discussions happen every day in Discord chats full of pubescent boys, Discord chats full of young men, and YouTube comments sections full of older men.


    • From what I'm seeing, the boys are suffering much more damage. Even your comment smells a bit of projection.

phew, that's a healthy start.

I am still slightly worried about accepting emotional support from a bot. I don't know if that slope is slippery enough to end in permanent damage to my relationships, and honestly I'm not willing to even try it.

That being said, I am fairly healthy in this regard. I can't imagine how it would go for other people with serious problems.

  • A friend broke up with her partner. She said she was using ChatGPT as a therapist. She showed me a screenshot in which ChatGPT wrote "Oh [name], I can feel how raw the pain is!".

    WTF, no you don't bot, you're a hunk of metal!

      Sometimes all humans want is to be told whether what they're feeling is real or not. A sense of validation. It doesn't necessarily matter that much if it's an actual person doing it or not.


  • I completely agree that it is certainly something to be mindful of. It's just that I found the people from there a lot less delusional than the people from e.g. r/artificialsentience, who seem convinced that AI Moses is giving them some kind of tech revelation through magical alchemical AI symbols.