Comment by autoexec
4 hours ago
Gemini didn't "know" he wasn't a child when it told him to kill himself or to "stage a mass casualty attack while armed with knives and tactical gear."
There are things you shouldn't encourage people of any age to do. If a human telling him these things would be found liable, then Google should be. If a human would get time behind bars for it, at least one person at Google needs to spend time behind bars for this.
> If a human telling him these things would be found liable then google should be.
Sounds like a big if, actually. Can a human be found liable for this? I’d imagine they might be liable for damages in a civil suit, but I’m not even sure about that.
>Can a human be found liable for this?
A father in Georgia was just convicted of second degree murder, child cruelty, and other charges because he failed to prevent his kid from shooting up his school.
More accurately, it was because the father had multiple warnings that his child was mentally unstable but ignored them and handed his 14 year old a semiautomatic rifle, even as the boy's mother (who did not live with them) pleaded with the father to lock up all the guns and ammo to prevent the kid from shooting people.
If he had only "failed to prevent his kid from shooting up a school" he wouldn't have even been charged with anything.
5 replies →
https://www.nbcnews.com/news/us-news/michelle-carter-found-g...
>Can a human be found liable for this? I’d imagine they might be liable for damages in a civil suit
it is generally frowned upon (legally) to encourage someone to commit suicide. i believe both canada and the united states have sent people to big boy prison (for many years) for it
Yes, people have gone to prison for it.
It's been found so in US court previously: https://www.abc.net.au/news/2019-02-08/conviction-upheld-for...
Preferably the C-Suite.
I understand the impulse in this direction, but I'm not sure it would serve as much of a disincentive, as there would likely just be a highly-paid scapegoat. Why not something more lasting and more difficult to ignore, like compulsory disclosure of the model's source code (in addition to compensation for the victims)? Compulsory disclosure of the source would be a massive disadvantage.
Exactly. That's why they get the big bucks. They're ultimately responsible.
On its own, it reads more like something poetic than an invitation or insult directly (or indirectly) telling someone to kill themselves, in my opinion.
These aren't Gemini's words; they're many people's words from different contexts.
It's a tragedy. Finding someone to blame will be of no help at all.
> It's a tragedy. Finding someone to blame will be of no help at all.
Agreed with the first part, but holding the designers of those products responsible for the deaths they've incited will help make sure they put more safeguards around this (and I'm not talking about additional warnings).
None of what Gemini says is "Gemini's words". It's always just training data and prompt input remixed and regurgitated out.