Comment by nlh

14 days ago

I dunno.

I've been reading a lot of "screw 'em" comments re: the deprecation of 4o, and I agree there are some serious cases of AI psychosis going on with the people who are hooked, but damn, this is pretty cold - these are humans with real feelings and real emotions here. Someone on X put it well (I'm paraphrasing):

OpenAI gave these people an unregulated experimental psychiatric drug in the form of an AI companion, they got people absolutely hooked (for better or for worse), and now OpenAI is taking it away. That's going to cause some distress.

We should all have some empathy for the (very real) pain this is causing, whether it's due to psychosis or otherwise.

And I agree! It's something I touch upon about halfway through, iirc, but their suffering shouldn't be something to laugh at or mock. It's genuinely upsetting to see, to be honest.

At the same time, though, I don't think it's healthy to let them go on with 4o either (especially since new users can start chatting with it).

When it's AI deprecation, it's inhumane and painful. But when Disney puts a film in their vault, it's a masterstroke of artificial scarcity.

I think we're too attached to media.

I’m not sure “AI psychosis” is even right for many of those users who formed attachments to their “companions”.

Psychosis is a real risk for schizophrenia spectrum disorders, but a lot of those relationships look to be rooted in disordered attachment.

Releasing the weights is an easy and low-cost way for OpenAI to fix this problem.

  • According to the thesis of this article, releasing the weights would be approximately the worst thing OpenAI could do for these people.

    • I kind of agree with GP more than the author here. OpenAI got these people hooked, and pulling the plug is potentially more harmful than letting them continue to chat with it until they move on naturally (assuming that they eventually will).

      4 replies →