Comment by macintux

3 days ago

Effectively they’ve turned any edge case into a potential city-wide problem and PR nightmare.

One driver doesn’t know how to handle a power outage? It’s not news. Hundreds of automated vehicles all experience the same failure? National news.

I live in the affected neighborhood. There were human drivers who did not know how to handle the power outage either; a minority of drivers, but still a nontrivial number. I even saw a Muni bus blow through a blacked-out intersection. The difference is that the Waymos failed in a way that prevented potential injury, whereas the humans who failed all failed in ways that created potential injury.

I wish the Waymos handled it better, yes, but I think that the failure state they took is preferable to the alternative.

  • Locking down the roads creates a lot of potential injuries too.

    And "don't blow through an intersection with dead lights" is super easy to program. That's not enough for me to forgive all that much misbehavior.

  • > The difference is the Waymos failed in a way that prevented potential injury

    No one was injured this time, but that's a huge assumption on your part.

    • I mean, yes, if the Waymos could safely pull over, or even knew how to handle every emergency situation, I think that would be better. I'd say that's a big ask though. Training autonomous vehicles for blackouts, fires, earthquakes, tornadoes, hail storms, landslides, sinkholes, tsunamis, floods, or even just fog is not really feasible, given that most humans won't even navigate those properly. I'll keep saying it: I'm glad the cars were set to fail safely when they encountered a situation they couldn't understand.

      I honestly wish the human drivers blowing through intersections that night had done the same. It's a miracle no one was killed.
Yeah, the correlated risk with AVs is a pretty serious concern. And not just in emergencies where they can easily DDOS the roads, but even things like widespread weaknesses or edge cases in their perception models can cause really weird and disturbing outcomes.

Imagine a model that works really well for detecting cars and adults but routinely misses children; you could end up with cars that are 1/10th as deadly to adults but 2x as deadly to children. Yes, in this hypothetical it saves lives overall, but is it actually a societal good? In some ways yes; in other ways it should never be allowed on any roads at all. It's one of the reasons aggregated safety metrics are so important to scrutinize.

Right. You know there are humans somewhere in the city who got confused or scared and messed up too. Maybe a young driver on a temporary permit who is barely confident in the first place, or just someone who doesn't remember what to do and was already over-stressed.

Whatever, it happens.

This was a (totally unintentional) coordinated screw up causing problems all over as opposed to one small spot.

The scale makes all the difference.

  • Definitely. The question then becomes: how do they respond to the cues of other, more experienced drivers?

    E.g., if they see 5 cars going around them and "solving" the intersection, are they empowered to do the same? Or do some annoying honkers behind them make them bite the bullet and try their hand at crossing it (and not to worry, other drivers will also make sure no harm comes to anyone even if they make a small mistake)? Human drivers, no matter how inexperienced, will learn on the spot. Self-driving vehicles can only "learn" back in the SW department.

    Yes, driving is a collaborative activity which requires that we all cooperate on finding the most efficient patterns of traffic when traffic lights fail. Self-driving cars cannot learn on the spot, and this is the main difference between them and humans: either they have been trained on every situation, or they go into weird failure modes like this.

  • Was it unintentional? These systems were programmed to fall back into "terrified 16-year-old/elderly lady" behavior because that's what's most legally defensible.