Comment by SpicyLemonZest
11 hours ago
I do see a large number of people getting affected by this. Character.AI reportedly has 20 million MAU with an average usage of 75 minutes per day (https://www.wired.com/story/character-ai-ceo-chatbots-entert...), and, as far as I can tell, it has no use case other than boundary-degrading roleplay.
Medical data is reported on a substantial lag in the US, so right now we have no idea what last year's suicide rate was, but I would falsifiably predict it will turn out to be elevated because of stories like that of Mr. Gavalas.
If its sole contribution is to help 20 million people find an outlet for boundary play that is not the far more common 'nonconsensual abuse of other human beings', then that sounds like a win. Of course I'd prefer those people invest in human kink communities, but I can certainly respect their choice not to. Tech has always in part been about meeting needs that some parts of society find awkward (photocopiers enabled Spirkfic, CU-SeeMe reflectors were designed specifically to support exhib-cruising years before the web got webcam support, etc.). While there's a slim chance that some might normalize it back into real life, people are much more likely to be raised with boundary abuse as an everyday normal by their parents (especially here in the U.S.!) than to be unknowingly converted into abusers by a chatbot.
That is not at all what I meant by "boundary" and it's concerning to me that you'd assume it is.
> That is not at all what I meant by "boundary" and it's concerning to me that you'd assume it is.
Your clarification of what you did mean is 404 not found in your reply, and your "concerning" insult of my personal character is not appreciated.