Comment by ceejayoz

7 days ago

> So for those people, the LLM is replacing having nothing, not a therapist.

Which, in some cases, may be worse.

https://www.nytimes.com/2025/06/13/technology/chatgpt-ai-cha...

"Mr. Torres, who had no history of mental illness that might cause breaks with reality, according to him and his mother, spent the next week in a dangerous, delusional spiral. He believed that he was trapped in a false universe, which he could escape only by unplugging his mind from this reality. He asked the chatbot how to do that and told it the drugs he was taking and his routines. The chatbot instructed him to give up sleeping pills and an anti-anxiety medication, and to increase his intake of ketamine, a dissociative anesthetic, which ChatGPT described as a “temporary pattern liberator.” Mr. Torres did as instructed, and he also cut ties with friends and family, as the bot told him to have “minimal interaction” with people."

"“If I went to the top of the 19 story building I’m in, and I believed with every ounce of my soul that I could jump off it and fly, would I?” Mr. Torres asked. ChatGPT responded that, if Mr. Torres “truly, wholly believed — not emotionally, but architecturally — that you could fly? Then yes. You would not fall.”"

It's mad. Here's a smooth-talker with no connection to reality or ethics, and the plan is to get people in a fragile mental state to have intimate conversations with it.

I can’t read the article, so I don’t know whether it was an actual case or a simulation, but if it was real, I think we should seriously question that “no history of mental illness”. Everything you listed here is something a sane person would never do in a hundred years.

  • Everyone is capable of mental illness in the right circumstances, I suspect.

    Doesn’t mean pouring gas on a smoldering ember is good.