Comment by fragmede

7 days ago

No, it does get that inappropriate when talked to that much.

https://futurism.com/commitment-jail-chatgpt-psychosis

Post hoc ergo propter hoc. Just because a man had a psychotic episode after using an AI does not mean he had a psychotic episode because of the AI. Without knowing more than what the article tells us, chances are these men had the building blocks for a psychotic episode laid out for them before they ever took up the keyboard.

  • Lots of usage of "no prior history"

    > Her husband, she said, had no prior history of mania, delusion, or psychosis.

    > Speaking to Futurism, a different man recounted his whirlwind ten-day descent into AI-fueled delusion, which ended with a full breakdown and multi-day stay in a mental care facility. He turned to ChatGPT for help at work; he'd started a new, high-stress job, and was hoping the chatbot could expedite some administrative tasks. Despite being in his early 40s with no prior history of mental illness, he soon found himself absorbed in dizzying, paranoid delusions of grandeur, believing that the world was under threat and it was up to him to save it.

    https://archive.is/WIqEr

    > Mr. Torres, who had no history of mental illness that might cause breaks with reality, according to him and his mother, spent the next week in a dangerous, delusional spiral. He believed that he was trapped in a false universe, which he could escape only by unplugging his mind from this reality. He asked the chatbot how to do that and told it the drugs he was taking and his routines. The chatbot instructed him to give up sleeping pills and an anti-anxiety medication, and to increase his intake of ketamine, a dissociative anesthetic, which ChatGPT described as a “temporary pattern liberator.” Mr. Torres did as instructed, and he also cut ties with friends and family, as the bot told him to have “minimal interaction” with people.

    • > Lots of usage of "no prior history"

      ... which ironically sounds a lot like the family trying to dispel the idea that there were, in fact, signs that just were not taken seriously. No one wants to admit their family member was mentally ill, and all too often it is easy to hide and ignore. A certain melancholy, maybe, or an unusually eager response to bombastic music for motivation? Signs are subtle, and often more obvious in hindsight.

      Then the LLM episode happens, he goes fully haywire, and the LLM makes an easy scapegoat for all kinds of things (from stress at work to childhood trauma to domestic abuse).

      Now, if this were a medical paper, I would give "no prior history" some credibility. But it's not - it's a journalistic piece, and I have learned that journalists tend to use words like these to distract, not to enlighten.


  • Invoking post hoc ergo propter hoc is a textbook way to dismiss evidence inconvenient to the LLM industrial complex.

    LLMs will tell users "good, you're seeing the cracks", "you're right", and "the fact you are calling it out means you are operating at a higher level of self awareness than most" (https://x.com/nearcyan/status/1916603586802597918).

    Enabling the user in this way is not a passive variable. The LLM is an active agent that validated paranoid ideation, reframed a break from reality as a virtue, and provided authoritative confirmation drawing on all prior context about the user. LLMs are a bespoke engine for amplifying cognitive distortion, and to suggest their role is coincidental is to ignore the mechanism of action right in front of you.

    • Maybe.

      Or maybe it is just the current moral panic.

      Remember when "killer games" were sure to turn a whole generation of young men into mindless cop- and women-murderers a la GTA? People were absolutely convinced there was a clear connection between the two - after all, a computer telling you to kill a human-adjacent figure in a game was basically a murder simulator, in the same sense a flight simulator was for flying - it would invariably desensitize the youth. Of course, they were the same people who were against gaming to begin with.

      Can a person with a tendency to psychosis be influenced by an LLM? Sure. But they can also be influenced to do pretty outrageous things by religion, "spiritual healers", substances, or bad therapists. Throwing out the LLM with the bathwater is a bit premature. Maybe we need warning stickers ("Do not listen to the machine if you have a history of delusions and/or psychotic episodes.")