
Comment by scoofy

3 days ago

I live in the affected neighborhood. There were hundreds of drivers who did not know how to handle a power outage... it was a minority of drivers, but a nontrivial, notably large number. I even saw a Muni bus blow through a blacked-out intersection. The difference is that the Waymos failed in a way that prevented potential injury, whereas the humans who failed all failed in a way that would create potential injury.

I wish the Waymos handled it better, yes, but I think that the failure state they took is preferable to the alternative.

Locking down the roads creates a lot of potential injuries too.

And "don't blow through an intersection with dead lights" is super easy to program. That's not enough for me to forgive them for all that much misbehavior.

  • > is super easy to program

    What?!? We’re talking about autonomous vehicles here.

    • I wouldn't say "super easy" but if an autonomous vehicle isn't programmed to handle:

        1: streetlight with no lights
        2: streetlight with blinking red
          2.5: streetlight with blinking yellow
      

      Then they are 100% not qualified to be on the road. Those are basic situations and incredibly easy to replicate, simulate, and incorporate into the training data.

      That is to say, they are not edge cases.

      Dealing with other drivers in those settings is much harder, but that's a different problem, and you should be simulating your car in a wide variety of other-driver dynamics: from everyone being very nice to everyone being hyper-aggressive, and the full spectrum in between.


    • Specifically identifying road signs, traffic lights, and dead traffic lights is a narrow problem that has feasible solutions. To the point where we can reasonably say “yeah, this sub-component basically works perfectly.”

      Compared to the overall self-driving problem which is very much not a super easy problem.

    • The cars already know those are intersections with lights. I'm not talking about that part. Just the basic logic that you don't go through at speed unless there is a green (or yellow) light.
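The fallback being described is essentially the rule from driver's ed: a dark or flashing-red signal becomes an all-way stop. A minimal sketch of that decision logic (all names hypothetical; a real planner is vastly more involved and must also handle perception uncertainty):

```python
from enum import Enum

class SignalState(Enum):
    GREEN = "green"
    YELLOW = "yellow"
    RED = "red"
    FLASHING_RED = "flashing_red"
    FLASHING_YELLOW = "flashing_yellow"
    DARK = "dark"  # power outage: intersection known to have a signal, no lights visible

def intersection_action(state: SignalState) -> str:
    """Conservative fallback: treat a dark or flashing-red signal as an
    all-way stop; slow and yield on flashing yellow; otherwise obey the
    normal light."""
    if state in (SignalState.DARK, SignalState.FLASHING_RED):
        return "stop_then_proceed"  # behave as at a four-way stop
    if state is SignalState.FLASHING_YELLOW:
        return "slow_and_yield"
    if state is SignalState.RED:
        return "stop"
    return "proceed"  # green or yellow

print(intersection_action(SignalState.DARK))  # stop_then_proceed
```

The hard part, as the parent comments note, isn't this table of rules but negotiating right-of-way with unpredictable human drivers once everyone is improvising.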


> The difference is the Waymos failed in a way that prevented potential injury

No one was injured this time, but that's a huge assumption on your part.

  • I mean, yes, if the Waymos could safely pull over, or even knew how to handle every emergency situation, I think that would be better. I'd say that's a big ask, though. Training autonomous vehicles for blackouts, fires, earthquakes, tornadoes, hail storms, landslides, sinkholes, tsunamis, floods, or even just fog is not really feasible given that most humans won't even navigate them properly. I'll keep saying it: I'm glad the cars were set to fail safely when they encountered a situation they couldn't understand.

    I honestly wish the human drivers blowing through intersections that night would have done the same. It's a miracle no one was killed.

    • That's a non-response response.

      > I honestly wish the human drivers blowing through intersections that night would have done the same. It's a miracle no one was killed.

      A bit deflective, huh?