Comment by nzoschke
2 months ago
> we are now implementing fleet-wide updates
That ~1000 drivers on the road are all better trained on what to do in the next power outage is incredible.
There will always be unexpected events and mistakes made on the roads. Continual improvement that is locked in algorithmically across the entire fleet is way better than any individual driver's learning / training / behavior changes.
Humans seemed to navigate this just fine, even with all the Waymo roadblocks and without extra training. If every unknown requires a software update, this system is doomed to repeat this behavior over and over in the long term.
Humans do dumb stuff like drive their cars into flowing floodwaters and they show no signs of stopping. The Waymo Driver (the name for the hardware and software stack) is getting smarter all the time.
As recently as 3 weeks ago, the Waymo Driver was also driving into floodwaters.
https://old.reddit.com/r/SelfDrivingCars/comments/1pem9ep/hm...
Humans do indeed drive into floodwaters like fools, but here's a critical point that's often missed when talking about how self-driving cars will make the roads safer: you don't. Self-driving cars can potentially be safer in general, but not necessarily for you in particular.
Imagine I created a magic bracelet that could reduce bicycling-related deaths and injuries from 130,000 a year to 70,000. A great win for humans! The catch is that everyone would need to wear it, even people who do not ride bikes, and those 70,000 deaths and injuries would be randomly distributed among the entire population. Would you wear it?
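To put rough numbers on that (the population and cyclist headcounts below are invented for illustration; only the 130,000 and 70,000 figures come from the comment above), a quick per-person risk sketch:

    # Bracelet thought experiment, with hypothetical headcounts.
    POPULATION = 330_000_000   # assumed total population
    CYCLISTS = 50_000_000      # assumed number of regular riders

    # Today, the 130k deaths/injuries fall only on cyclists.
    risk_cyclist_now = 130_000 / CYCLISTS     # ~0.26% per year
    risk_noncyclist_now = 0.0                 # don't ride, don't bleed

    # With the bracelet, 70k are spread randomly over everyone.
    risk_anyone_after = 70_000 / POPULATION   # ~0.021% per year

    print(f"{risk_cyclist_now:.3%} {risk_noncyclist_now:.3%} {risk_anyone_after:.3%}")

Total harm roughly halves, but the non-cyclist's personal risk goes from zero to about 0.02% per year — exactly the "safer in general, but not necessarily for you" distinction.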
> seemed to navigate this just fine
From my understanding, the reason the Waymos didn't handle this was that humans were breaking traffic rules and going when they shouldn't have been. If most humans had navigated it correctly, the Waymos would have handled it better.
It's mentioned in the article: the real problem was that they kept trying to contact remote support to "verify" the light was out, leading to a backlog of requests they couldn't get through fast enough.
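That failure mode is just a queue whose arrival rate exceeds its service rate. A toy simulation (all rates invented, nothing here is from Waymo) shows why the backlog snowballs:

    # Toy backlog model: arrivals outpace the support team.
    arrival_rate = 30   # assumed verification requests per minute
    service_rate = 10   # assumed requests cleared per minute

    backlog = 0
    for minute in range(1, 6):
        backlog += arrival_rate                # new "is the light out?" calls
        backlog -= min(backlog, service_rate)  # what support can clear
        print(f"minute {minute}: {backlog} requests waiting")
    # Grows by 20/minute: every stalled Waymo waits longer than the last.

Once every car at every dark intersection phones home for the same confirmation, wait times climb for all of them, which would explain why the backlog couldn't be cleared fast enough.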
This attitude is exactly how the Waymos came to handle the problem so poorly in the first place. The Principal Skinner "everyone else is wrong" bit is just icing on the cake.
You can't just program it to "copy others when confused," because it will invariably violate the letter of the law and people will screech. So they pick the legally safe but obviously ineffective option: have it behave like a teenager on day 1 of driver's ed and basically freeze up. Of course that most certainly does not scale, but it covers their asses, so it's what they've got to do.
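A caricature of that trade-off as decision logic (entirely hypothetical, not Waymo's actual stack):

    # Hypothetical fallback policy at a dead traffic light.
    def choose_fallback(light_confirmed_out: bool, crowd_moving: bool) -> str:
        if light_confirmed_out:
            return "treat as four-way stop"
        if crowd_moving:
            # "Copy others": keeps traffic flowing, but inherits the
            # crowd's rule-breaking and the liability that comes with it.
            return "follow surrounding traffic"
        # The legally safe default: freeze and phone remote support.
        return "stop and request verification"

The last branch is the cover-your-ass option the comment describes, and it's the one that piles up remote-support requests.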
My lived experience with human drivers and signal outages at intersections is that most people get it very wrong. If you're lucky and the darkened intersection is one lane in each direction, more often than not everything works out well. But any intersection with multiple lanes, and especially one where a primary road crosses a lower-traffic secondary road, is going to be full of people just flying through as if they'd had a green the whole time.
The average human driver is much worse than Waymo.