Comment by imoverclocked
2 months ago
Humans seemed to navigate this just fine, even with all the Waymo roadblocks and without extra training. If every unknown requires a software update, this system is doomed to repeat this behavior over and over in the long term.
Humans do dumb stuff like drive their cars into flowing floodwaters, and they show no signs of stopping. The Waymo Driver (the name for the hardware and software stack) is getting smarter all the time.
As recently as 3 weeks ago, the Waymo Driver was also driving into floodwaters.
https://old.reddit.com/r/SelfDrivingCars/comments/1pem9ep/hm...
Humans do indeed drive into floodwaters like fools, but here's a critical point that's often missed when talking about how self-driving cars will make the roads safer: you don't. Self-driving cars can potentially be safer in general, but not necessarily for you in particular.
Imagine I created a magic bracelet that could reduce bicycling-related deaths and injuries from 130,000 a year to 70,000. A great win for humans! The catch is that everyone would need to wear it, even people who do not ride bikes, and those 70,000 deaths and injuries would be randomly distributed among the entire population. Would you wear it?
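To put numbers on the redistribution, here's a back-of-the-envelope sketch. Only the 130k/70k figures come from the analogy; the population split is an assumption I'm inventing purely for illustration:

    # Toy numbers: only the 130k/70k figures come from the analogy above;
    # the population split is invented for illustration.
    population = 330_000_000   # assumed total population
    cyclists   = 50_000_000    # assumed number of people who ride bikes

    injuries_before = 130_000  # all borne by cyclists
    injuries_after  = 70_000   # randomly spread across everyone

    risk_cyclist_before    = injuries_before / cyclists
    risk_noncyclist_before = 0.0
    risk_anyone_after      = injuries_after / population

    print(f"cyclist risk before:     {risk_cyclist_before:.6f}")    # ~0.002600
    print(f"non-cyclist risk before: {risk_noncyclist_before:.6f}") # 0.000000
    print(f"anyone's risk after:     {risk_anyone_after:.6f}")      # ~0.000212

    # Aggregate harm drops by ~46%, but a non-cyclist's personal risk
    # goes from zero to nonzero -- that's the crux of the objection.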
I don't understand the analogy. No one is being forced to stop driving and take autonomous rides. If I am a better-than-average driver (debatable), I'm glad to have below-average drivers use autonomous vehicles instead.
>seemed to navigate this just fine
From my understanding, the reason the Waymos didn't handle this well was that humans were breaking traffic rules and going when they shouldn't have been. If most humans had navigated it correctly, the Waymos would have handled it better.
As mentioned in the article, the real problem was that they kept trying to contact remote support to "verify" the light was out, leading to a backlog of requests that support couldn't clear fast enough.
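That's a classic queueing failure: if verification requests arrive faster than remote operators can clear them, the backlog grows without bound. A toy sketch (all rates invented for illustration; none of this is Waymo's actual remote-assistance architecture):

    # Toy queue model: arrival/service rates are made up, not Waymo's.
    arrivals_per_min = 12   # cars hitting the dark signal and phoning home
    served_per_min   = 5    # verifications an operator pool can clear

    backlog = 0
    for minute in range(1, 11):
        backlog += arrivals_per_min - served_per_min
        print(f"minute {minute:2d}: {backlog:3d} cars waiting on support")

    # With arrivals > service, the wait grows linearly, and every stalled
    # car blocks the intersection while it waits for a human to answer.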
This attitude is exactly how the Waymos came to handle the problem so poorly in the first place. The Principal Skinner "no, it's everyone else who is wrong" bit is just icing on the cake.
You can't just program it to "when confused, copy others," because it will invariably violate the letter of the law and people will screech. So they pick the legally safe but obviously ineffective option: have it behave like a teenager on day 1 of driver's ed and basically freeze up. That most certainly does not scale, but it covers their asses, so it's what they've gotta do.
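The tradeoff being described is roughly a choice between two fallback policies. A hypothetical sketch of that choice (the names and threshold are mine, not anything from Waymo's stack):

    # Hypothetical fallback policies for a dark traffic signal --
    # illustrative only; none of this reflects Waymo's real logic.
    def fallback_mimic(peer_cars_proceeding: int) -> str:
        """'Copy others': flows with traffic, but inherits their violations."""
        return "proceed" if peer_cars_proceeding >= 3 else "wait"

    def fallback_freeze() -> str:
        """'Day 1 of driver's ed': legally safe, useless at scale."""
        return "stop_and_call_remote_support"

    print(fallback_mimic(peer_cars_proceeding=5))  # -> "proceed"
    print(fallback_freeze())                       # -> "stop_and_call_remote_support"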
Traffic safety engineers often have influence on the letter of the law. We would all be better off if people followed it (humans are bad judges of the exceptions).
My lived experience with human drivers and signal outages at intersections is that most people get it very wrong. If you're lucky and the darkened intersection is one lane in each direction, more often than not everything works out fine. But any intersection with multiple lanes, or especially one where a primary road crosses a lower-traffic secondary road, is going to be full of people just flying through as if they had a green the whole time.
The average human driver is much worse than Waymo.