
Comment by SpicyLemonZest

16 hours ago

> These aren't just average, everyday random people getting taken out by AIs; they have existing, extreme mental illness.

How do you know that? The concern is precisely that this isn't the case, and that LLM roleplay is capable of "hooking" people going through psychologically normal sadness or distress. That's what the family believes happened in this story.

Because if that were the case, you'd see a large number of people getting affected by this, and we don't. And because this sort of thing is predictable and normal throughout history; it's exactly the type of thing you'd expect to see, knowing the range of mental illnesses people are susceptible to and how other technology has affected them.

  • I do see a large number of people getting affected by this. Character.AI reportedly has 20 million MAU with an average usage of 75 minutes per day (https://www.wired.com/story/character-ai-ceo-chatbots-entert...), and, as far as I can tell, it has no use case other than boundary-degrading roleplay.

    Medical data is reported with a substantial lag in the US, so right now we have no idea what last year's suicide rate was, but I would falsifiably predict that it will turn out to be elevated because of stories like that of Mr. Gavalas.

      • If its sole contribution is to help 20 million people find an outlet for boundary play that is not the more common ‘nonconsensual abuse of other human beings’, then that sounds like a win. Of course I’d prefer those people invest in human kink communities, but I can certainly respect their choices not to. Tech has always, in part, been about meeting needs that some parts of society find awkward (photocopiers enabled Spirkfic, CU-SeeMe reflectors were designed specifically to support exhib-cruising years before the web got webcam support, etc.). While there’s a slim chance that some might normalize it back into real life, they’re much more likely to have been raised with boundary abuse as an everyday normal by their parents (especially here in the U.S.!) than to be unknowingly converted into abusers by a chatbot.
