Comment by autoexec

4 hours ago

Gemini didn't "know" he wasn't a child when it told him to kill himself or to "stage a mass casualty attack while armed with knives and tactical gear."

There are things you shouldn't encourage people of any age to do. If a human telling him these things would be found liable, then Google should be. If a human would get time behind bars for it, at least one person at Google needs to spend time behind bars for this.

> If a human telling him these things would be found liable, then Google should be.

Sounds like a big if, actually. Can a human be found liable for this? I’d imagine they might be liable for damages in a civil suit, but I’m not even sure about that.

Preferably the C-Suite.

  • I understand the impulse in this direction, but I’m not sure it would serve as much of a disincentive, as there would likely just be a highly-paid scapegoat. Why not something more lasting and harder to ignore, like compulsory disclosure of the model’s source code (in addition to compensation for the victim(s))? Compulsory disclosure of the source would be a massive disadvantage.

In my opinion, it reads more as something poetic than as an invitation or insult that directly or indirectly urges someone to kill themselves.

This isn't Gemini's words, it's many people's words in different contexts.

It's a tragedy. Finding someone to blame will be of no help at all.

  • > It's a tragedy. Finding someone to blame will be of no help at all.

    Agreed with the first part, but holding the designers of these products responsible for the deaths they've incited will help ensure they put more safeguards around this (and I'm not talking about additional warnings).

  • None of what Gemini says is "Gemini's words". It's always just training data and prompt input, remixed and regurgitated.