Comment by MBCook
3 days ago
Does it matter?
Once you’re on public roads, you need to ALWAYS fail-safe. And that means not blocking the road/intersections when something unexpected happens.
If you can physically get out of the way, you need to. Period.
> Does it matter
Yes. OP is inferring Waymo's internal processes from this meltdown. ("Makes me think there are likely other obvious use cases they haven’t thought about proactively either.")
If Waymo literally didn't foresee a blackout, that's a systemic problem. If, on the other hand, there was some weird power and cellular meltdown that coïncided with something else, that's a fixable edge case.
> > Does it matter
> Yes. OP is inferring Waymo's internal processes from this meltdown. ("Makes me think there are likely other obvious use cases they haven’t thought about proactively either.")
No, I'm not inferring internal processes.
I'm guessing level of critical thinking.
When you are creating autonomous vehicles, one of the things that you want to risk assess and have mitigation for is what you want the vehicles to do in case the systems they depend on fail (e.g. electricity, comms).
Now, it could be that the team has anticipated those things but some other failure in their systems has caused vehicles to stop in the middle of intersections, blocking traffic (as per the article).
I'm super curious to learn more about what Waymo encountered and how they plan to up their game.
> I'm not inferring internal processes…I'm guessing level of critical thinking
Genuine question: how do these differ? Isn’t the level of critical thinking of Waymo’s employees internal to it? (What’s the mens rea analogue for a company?)
The "coinciding problems" should be an assumption, not an edge case we reason away. Black swan events are always going to have cascading issues: a big earthquake means lights out AND cell towers overloaded or out, not to mention debris in streets, etc.
What they need is a "shit is fucked fallback" that cedes control. Maybe there is a special Bluetooth command any police or ambulance can send if nearby, like clear the intersection/road.
Or maybe the doors just unlock and any human can physically enter and drive the car up to X distance. To techies and lawyers it may sound impossible, but for normal humans, that certainly sounds better. Like that Mitch Hedberg joke, when an escalator is out of order it becomes stairs. When a Waymo breaks it should become a car.
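To make the "clear the road" idea concrete, here is a minimal sketch of what an authenticated, bounded override command might look like. Everything here is hypothetical (the shared-key provisioning, the message format, the 30-meter cap, and the Bluetooth transport are all assumptions, not anything Waymo actually does):

```python
import hmac, hashlib

# Hypothetical sketch: emergency services broadcast a short, signed
# command that a stranded vehicle obeys even with the cloud unreachable.
# Key distribution and the radio transport are hand-waved.

FLEET_KEY = b"shared-secret-provisioned-offline"  # placeholder secret
MAX_CREEP_METERS = 30  # cap how far any override can move the car

def make_clear_command(nonce: bytes, meters: int) -> bytes:
    """Emergency-services side: sign a bounded 'move forward' request."""
    msg = b"CLEAR:%d:" % meters + nonce
    sig = hmac.new(FLEET_KEY, msg, hashlib.sha256).hexdigest().encode()
    return msg + b":" + sig

def accept_clear_command(packet: bytes) -> int:
    """Vehicle side: return meters to creep forward, or 0 if rejected."""
    try:
        msg, sig = packet.rsplit(b":", 1)
        expected = hmac.new(FLEET_KEY, msg, hashlib.sha256).hexdigest().encode()
        if not hmac.compare_digest(sig, expected):
            return 0  # bad signature: ignore the command entirely
        meters = int(msg.split(b":")[1])
        return min(meters, MAX_CREEP_METERS)  # never exceed the hard cap
    except (ValueError, IndexError):
        return 0  # malformed packet: ignore
```

The point of the cap and the signature is that the override stays narrow: a valid command can only nudge the car out of the intersection, and a forged or garbled one does nothing.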
> Or maybe the doors just unlock and any human can physically enter and drive the car up to X distance.
Do they even have physical controls to do that at this point?
I’ve never been in one so I don’t know how different they are from normal cars today.
>If Waymo literally didn't foresee a blackout, that's a systemic problem.
I agree with this bit
> If, on the other hand, there was some weird power and cellular meltdown that coïncided with something else, that's a fixable edge case.
This is what I have a problem with. That’s not an edge case. There will always be a weird thing no one programmed for.
Remember a few years ago when a semi truck overturned somewhere and poured slimy eels all over the highway? No one’s ever gonna program for that.
It doesn’t matter. There has to be an absolute minimum fail-safe that always works if the car is capable of moving safely. The fact that a human couldn’t be reached to press a button authorizing it is not acceptable. Not having the human available is a totally foreseeable problem. It’s Google. They know networks fail.
This isn't to disagree with your overall point about proper emergency mitigation and having humans available.
> Remember a few years ago when a semi truck overturned somewhere and poured slimy eels all over the highway? No one’s ever gonna program for that.
While the cause is unusual, this is really just three things that everyone absolutely should be programming into their autonomous vehicles: accidents, road debris, and slick conditions.
A fail-safe is EXACTLY blocking roads at intersections without power, not proceeding through intersections without power. It's much safer to be stopped than to keep going. I honestly wish the humans driving through blacked-out intersections without slowing down in my neighborhood last night actually understood this.
It’s not a fail-safe. It’s a different failure mode. Jamming up traffic, including emergency traffic, creates systemic problems.
It’s a bit like designing an electronic lock that can’t be opened if the power goes out. If your recourse to exiting a dangerous situation becomes breaking the door, then the lock is unsafe.
Fail-safe means "in a situation where the function fails, fail in a way that doesn't cause injury" -> the cars didn't know how to proceed, so they stopped, with their lights on, in a way that any attentive driver could safely navigate... which is failing safe.
The alternative here is a protocol that obviously hasn't been tested. How on earth are you going to test a Waymo in blackout conditions? I would rather have them just stop than hope they navigate those untested conditions with vulnerable pedestrians and vehicles acting unpredictably.
An intersection without power is just a 4-way stop.
An intersection without power is supposed to be treated as a 4-way stop. Unfortunately, a nontrivial number of drivers last night were not following that rule.
> Once you’re on public roads, you need to ALWAYS fail-safe.
Yes.
> And that means not blocking the road/intersections when something unexpected happens.
No. Fail-operational is not the only allowable fail-safe condition for automobiles. For example, it is acceptable for loss of propulsion to cause a stop-in-lane; the alternative would be to require high-availability propulsion systems, or to require drivers to always carry enough kinetic energy to coast to the side of the road. That just isn’t the case.
One can argue that when operating a fleet with correlated failure modes the rules should change a bit, but that’s a separate topic.
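One way to picture the fail-operational vs stop-in-lane distinction: a degraded-mode policy is a priority list, and stop-in-lane is the acceptable floor, not a bug. A minimal sketch, where the function names, the three subsystems, and the policy itself are all hypothetical (this is not Waymo's actual logic):

```python
from enum import Enum, auto

class FallbackAction(Enum):
    CONTINUE_ROUTE = auto()    # fail-operational: everything nominal
    PULL_TO_SHOULDER = auto()  # degraded: still able to clear the lane
    STOP_IN_LANE = auto()      # floor: stop with hazards on; still fail-safe

def choose_fallback(has_propulsion: bool, has_perception: bool,
                    has_comms: bool) -> FallbackAction:
    """Pick the least-degraded action the surviving systems can support.

    Hypothetical policy: losing comms alone shouldn't strand the car;
    only losing propulsion or perception forces a stop-in-lane.
    """
    if has_propulsion and has_perception and has_comms:
        return FallbackAction.CONTINUE_ROUTE
    if has_propulsion and has_perception:
        # No remote operator reachable, but the car can still move
        # safely: clear the intersection, then park out of traffic.
        return FallbackAction.PULL_TO_SHOULDER
    return FallbackAction.STOP_IN_LANE
```

Under a policy like this, the blackout scenario (comms down, car otherwise healthy) would land in the middle tier rather than the bottom one, which is roughly the distinction being argued over in this thread.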