Comment by ToucanLoucan
5 hours ago
> If you’re mentally ill enough that your cause of death is “LLM suicide”, then clearly you need a LOT of help.
NO. SHIT. You know what didn't help one damn bit? Gemini didn't. It gave him a hopeful way out at the end of a rope and he took it, because he was in too dark a place to think straight.
> Should Gemini have terminated the conversation after suggesting the hotline?
That would be the BARE FUCKING MINIMUM! Not only should it NOT engage with or encourage his delusions, it should stop talking to him altogether, and arguably Google should have moderators reporting these people to relevant authorities for wellness checks and interventions!
As I said, I don't work for an AI company and have zero skin in the game. Idk who you're yelling at, to be honest. I guess you're fired up and emotional. If your goal is to convince others, communicating in an "outrage" tone is unlikely to sway anyone's opinion (imo).
> it should stop talking to him altogether, and arguably Google should have moderators reporting these people to relevant authorities for wellness checks and interventions
I agree. This seems very reasonable and I would welcome regulations in this area.
The gray area imo is when local LLMs become "good enough" for your average Joe to run on their laptop. Who bears responsibility then? Should Ollama (and similar tools) be banned? Where is the line drawn?