Comment by jajuuka

10 hours ago

Any mental illness mixed with delusions is likely going to end badly, whether they think Gemini is alive, a video game is real life, or that Bjork loves them without ever talking to or meeting her. While LLMs are interactive and listening to an album isn't, I don't think there is a fix to this outside posting a warning after every prompt: "I am not a real person; if you have mental health issues, please contact your doctor or emergency services." Which I think is about as useful as a sign in a casino next to the cash-out counter that says if you have a problem, call this number.

I'm more inclined to believe that this case is getting amplified in MSM because it fits an agenda. Like the people who got hurt using black market vapes. Boosting those stories and making it seem like an epidemic supports whatever message they want to send. Which usually involves money somewhere.

> I'm more inclined to believe that this case is getting amplified in MSM because it fits an agenda.

I mean, tech in general has been negatively covered in the media since 2015 due to latent agendas: (a) supposed revenue loss due to the existence of Google/FB etc., and (b) a desire to align neutral moderation stances with the viewpoint preferred by the political party in question.

There is a solution, though: anyone hoping to roleplay with models submits identity verification, an escrow amount, and a recorded statement acknowledging their risky use of the model. However, I assume the market for this is not insignificant, and therefore companies hope to avoid putting in such requirements. OpenAI has been moving in that direction, as seen during the 4o debacle.

  • But how would your solution have helped in this case?

    The guy was probably a paying user, so Google would have already known who he is. He's also 36, so there's no excluding him based on age. And neither the escrow nor the statement really adds much, in my view.