Comment by Dylan16807

3 days ago

Locking down the roads creates a lot of potential injuries too.

And "don't blow through an intersection with dead lights" is super easy to program. That's not enough for me to forgive them for all that much misbehavior.

> is super easy to program

What?!? We’re talking about autonomous vehicles here.

  • I wouldn't say "super easy", but if an autonomous vehicle isn't programmed to handle:

      1. a traffic light with no lights (dark)
      2. a traffic light with blinking red
      3. a traffic light with blinking yellow
    

    Then they are 100% not qualified to be on the road. Those are basic situations and incredibly easy to replicate, simulate, and incorporate into the training data.

    That is to say, they are not edge cases.

    Dealing with other drivers in those settings is much harder, but that's a different problem, and you should be simulating your car across a wide variety of other-driver dynamics, from everyone being very nice to everyone being hyper-aggressive and the full spectrum in between.
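    For what it's worth, the fallback behaviors for those signal states fit in a few lines. This is a hedged sketch, not code from any real AV stack: the `SignalState` names and `fallback_behavior` function are mine, and the behaviors follow the common US traffic-code convention that a dark or flashing-red signal reduces to stop-sign rules.

    ```python
    from enum import Enum, auto

    class SignalState(Enum):
        GREEN = auto()
        YELLOW = auto()
        RED = auto()
        FLASHING_RED = auto()     # treated like a stop sign
        FLASHING_YELLOW = auto()  # proceed with caution
        DARK = auto()             # dead light: treated as an all-way stop

    def fallback_behavior(state: SignalState) -> str:
        """Map a signal state to the required behavior at the stop line."""
        if state is SignalState.GREEN:
            return "proceed"
        if state in (SignalState.YELLOW, SignalState.FLASHING_YELLOW):
            return "proceed with caution"
        if state is SignalState.RED:
            return "stop and wait"
        # Flashing red and a dark signal both reduce to stop-sign rules.
        return "stop, yield, then proceed when clear"
    ```

    The point of the sketch is that the decision table is tiny and fully enumerable, which is why these states can be exhaustively simulated and folded into training data.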

    • If you are just arguing that they're not qualified to be on the road, then I agree with you. I've been an autonomous vehicle skeptic for a long time, mainly because I think our automobile transportation system is inherently dangerous. It's going to be a tough sell though, considering that they are already -- generally -- better drivers than a nontrivial number of human beings.

      It's a tough question. The entire reason I'm defending this shortcoming is exactly that I prefer the fail-safe shutdown to any attempt to navigate bizarre, barely traffic-code-conforming, blacked-out intersections that are inherently dangerous.

  • Specifically identifying road signs, traffic lights, and dead traffic lights is a narrow problem that has feasible solutions. To the point where we can reasonably say “yeah, this sub-component basically works perfectly.”

    Compared to the overall self-driving problem, which is very much not super easy.

  • The cars already know those are intersections with lights. I'm not talking about that part. Just the basic logic that you don't go through at speed unless there is a green (or yellow) light.
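    That "don't go through at speed" rule really is a single predicate, assuming the perception layer has already classified the light. A hypothetical sketch (`ok_to_enter_at_speed` and the state strings are illustrative, not from any real AV codebase):

    ```python
    def ok_to_enter_at_speed(light_state: str) -> bool:
        """A car may enter the intersection at speed only on green or yellow.

        Every other state -- red, flashing anything, or a dead/dark
        light -- requires slowing or stopping first.
        """
        return light_state in ("green", "yellow")
    ```

    Everything hard (detecting the light, classifying its state) lives upstream of this check, which is the distinction being drawn here.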

    • >The cars already know those are intersections with lights.

      That's not how any of this works. You can anthropomorphize all you like, but they don't "know" things. They can only respond predictably to situations represented in their training data, and a blackout scenario is not in the training data.
