Comment by Retric
3 years ago
That’s fair. I suspect we are at least 30 years from a drunk person being legally allowed to order their self-driving car home without issue, simply due to cultural inertia.
Assuming we get to that point, it will probably be another 20 years after that before driving yourself is seen the way riding a motorcycle is today: not suicidal, but definitely excessively dangerous.
I'd agree with that estimate. But sheesh, 50 years from now seems like forever.
Two final thoughts...
(1) Maybe within the next 50 years devs can instill in AI some meaning of death. As it is, I find some comfort knowing my driver realizes the difference between a field and a 40-foot cliff along the coast of Big Sur, and that we share a theory of mind about the consequences of swerving to avoid something in those situations.
(2) Regarding humans being more tolerant of human error: I think this might be because when a human gets into an accident, there is always the ability to reason that the person is different from us: old, tired, drunk, distracted, etc. Both the situation and the cognition are unique to one person. Naturally, we reason, we would have done something different to avoid the accident. If an AI gets into an accident, and we know that same exact AI is driving 10 million cars, including our own, that freaks us out a little.
I think (2) is more cultural than that. People reacted poorly to automatic elevators when they were first introduced, and now people just don’t think of them as much more complicated than a light switch. “The elevator broke” and “my light broke” have the same feel of an inanimate object not working, even though elevators are a lot more complicated.
I'm not sure if "the brakes on the train failed" and "the train's AI suddenly decided to start running in reverse" would get the same reception, even if the result was the same.