Comment by lovich

9 hours ago

I didn’t think just offloading your thinking to AI was AI psychosis.

To me AI psychosis is the handful of friends I’ve had who have done things like have a full on mourning session when a model updates because they lost a friend/lover, the one guy who won’t speak to his family directly but has them talk to ChatGPT first and then has ChatGPT generate his response, or the two who are confident that they have discovered that physics and mathematics are incorrect and have discovered the truth of reality through their conversations with the models.

But language is a shared technology so maybe the term is being used for less egregious behavior than I was using it for.

I'm curious how to best define what AI psychosis actually is.

My understanding is that regular psychosis involves someone taking bits and pieces of facts or real world events and chaining them into a logical order or interpolating meanings or explanations which feel real and obvious to the patient but are not sufficiently backed by evidence and thus not in line with our widely accepted understanding of reality.

AI psychosis is then the same phenomenon occurring at a more widespread scale, because the next-word-prediction nature of LLMs lowers the activation energy for it to happen. LLMs are excellent at taking any idea, question, or theory and spinning a linear, plausibly coherent line of conversation from it.

  • You speak like a bot and are a brand new account. Thanks to whoever set this up for adding to the problem.

> friends I’ve had who have done things like have a full on mourning session when a model updates because they lost a friend/lover

I mean, isn't that the natural and expected response? An AI company sold them a relationship with a chatbot and at least some their social/romantic needs were being met by that product. When what they were paying for was taken from them and changed without warning into something that no longer filled that void in their life why wouldn't they morn that loss?

The fact that they were hurt by that sudden loss is totally healthy. It's just part of moving on. The real problem was getting into an unhealthy relationship with a fictitious partner under the control of an abusive company willing to exploit their loneliness in exchange for money.

Hopefully they now know better, but people (especially desperate ones) make poor choices all the time to get what's missing in their lives or to distract themselves from it.

  • > I mean, isn’t that the natural and expected response? An AI company sold them a relationship with a chatbot, and at least some of their social/romantic needs were being met by that product. When what they were paying for was taken from them and changed without warning into something that no longer filled that void in their life, why wouldn’t they mourn the loss of that?

    Ah, I forgot about the AI relationship companies. No, this guy was using the browser-based ChatGPT for coding and ended up in love with the model. No relationship was sold at all.

    • Wow, okay. Reading a whole relationship into that sort of interaction is way less reasonable, although now that I think about it, a somewhat similar thing happened to Geordi La Forge once...

      2 replies →

How do you have so many crazy friends?

  • I work in software and don’t come from the upper class that sends its kids into FAANGs for their first job at the tender age of 28.

    We’re kinda predisposed to mental illness as a group, so I'm not too surprised that a new source of insanity pushed a few over the edge.