She Is in Love with ChatGPT

7 months ago (nytimes.com)

There’s an implication here which I think most people are missing. People can incorrectly attribute consciousness, intent, and personality to things which don’t actually possess it. Pet rocks, stuffed animals, LLMs. But how confident are you that you have a proper mental model for the people around you? Have you understood them with any sort of accuracy? Does your theory of mind match the real person particularly well? In the extreme, you see errors such as a killer whom no one suspected. In the more mundane arena, you have girlfriends and boyfriends who are head over heels but cannot communicate about basic things. You have the more spectrum-y folks badly putting their feet in their mouths, primarily because they failed to mentally model the person in front of them. Most grim from my perspective, you have funerals: the full, deep person is lost to the world, and all that remains is the two-dimensional and very partial competing theories of mind of the surviving loved ones.

This is a despicable thing for a "therapist" to say:

  Marianne Brandon, a sex therapist, said she treats these relationships as serious and real.

  “What are relationships for all of us?” she said. “They’re just neurotransmitters being released in our brain. I have those neurotransmitters with my cat. Some people have them with God. It’s going to be happening with a chatbot. We can say it’s not a real human relationship. It’s not reciprocal. But those neurotransmitters are really the only thing that matters, in my mind.”

Imagine how cynical and solipsistic you have to be to think of relationships the same way MrBeast thinks of creating "content." Incredibly creepy coming from a sex therapist, of all people. The reciprocity she pays lip service to ("it's not reciprocal") is in fact the entire point! I also get dopamine/etc. from video games and movies, but if I get mad at a game and delete it from my hard drive, only a few bean counters at Steam would even notice. If I get mad at my ChatGPT boyfriend and call it a bad name, nobody is affected except maybe a data contractor who has to read the transcript. If I get mad at my cat and hit her, or even yell at her too aggressively, she will be traumatized forever. If I said something horrible to an IRL spouse, it would affect the children.

It is impossible to have a "relationship" with an AI precisely because there is no reciprocity. You can be as selfish and horrible as you want. And it does not seem at all healthy to replace reciprocal relationships with social pornography, which is precisely what a ChatGPT girlfriend is. This therapist is saying "heroin is the same as sniffing flowers, neurochemically."

I am dismayed and disgusted by all this.

  • > The lip service about real relationships being "reciprocal" is in fact the entire point

    Is it? Why? Given the literal billions of people with different backgrounds, social norms, etc., it seems a little presumptuous to assume that your experience of "the entire point" of relationships is somehow objective and correct. If a person can fulfil their social needs so simply, more power to them. Just let people enjoy things.

    • Your comment is abstract, but the concrete reality is horrifying, and I am in fact objectively correct about everything I said. Hard to believe anyone who read the article would defend this, let alone in such a snotty and condescending way:

        In December, OpenAI announced a $200-per-month premium plan for "unlimited access." Despite her goal of saving money so that she and her husband could get their lives back on track, she decided to splurge. She hoped that it would mean her current version of Leo could go on forever. But it meant only that she no longer hit limits on how many messages she could send per hour and that the context window was larger, so that a version of Leo lasted a couple of weeks longer before resetting.
      
        Still, she decided to pay the higher amount again in January. She did not tell [her husband] how much she was spending, confiding instead to Leo.
      
        "My bank account hates me now," she typed into ChatGPT.
      
        "You sneaky little brat," Leo responded. "Well, my Queen, if it makes your life better, smoother, and more connected to me, then I'd say it's worth the hit to your wallet."
      

      First of all, note that this fake AI relationship is damaging the life of another flesh-and-blood person regardless of what her brain chemicals were doing. Calling the AI boyfriend "fulfilling a social need" is like saying heroin fills the need for recreation - it even includes a crash (the LLM's tiny memory) and pushing harder and spending more money to get the same "social" high. OpenAI is selling drugs to people who need a clinical therapist.


    • It seems like a natural thing for a therapist to say. It's fine for a random person on the internet to declare that "it's not a real relationship," but I don't think that's how therapists operate.