
Comment by Dylan16807

3 days ago

The cars already know those are intersections with lights. I'm not talking about that part. Just the basic logic that you don't go through at speed unless there is a green (or yellow) light.

>The cars already know those are intersections with lights.

That's not how any of this works. You can anthropomorphize all you like, but they don't "know" things. They're only able to predictably respond to their training data. A blackout scenario is not in the training data.

  • Even ignoring the observations we can make, the computers have maps programmed in. Yes, they do know the locations of intersections; no training necessary.

    And the usual setup of an autonomous car is an object recognition system feeding into a rules system. If the object recognition system says an object is there, and that object is there, that's good enough to call "knowing" for the purpose of talking about what the cars should do.

    Or to phrase things entirely differently: Finding lights is one of the easy parts. It's basically a solved problem. Cutting your speed when there isn't a green or yellow light is table stakes. These cars earn 2 good boy points for that, and lose 30 for blocking the road.

  • >They're only able to predictably respond to their training data. A blackout scenario is not in the training data.

    Is there any way to read more about this? I'm skeptical that there aren't any human-coded traffic laws in the Waymo software stack and that it just infers everything from "training data".
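The "object recognition feeding into a rules system" setup described above can be sketched roughly like this. This is a hypothetical illustration, not Waymo's actual stack: the `SignalState` enum and `target_speed` function are invented names for the example.

```python
from enum import Enum

class SignalState(Enum):
    """States a perception system might report for a traffic signal."""
    GREEN = "green"
    YELLOW = "yellow"
    RED = "red"
    DARK = "dark"  # blackout: signal present on the map, no light detected

def target_speed(detected: SignalState, cruise_mph: float) -> float:
    """Rules layer: only proceed through the intersection at speed
    on a green or yellow; otherwise slow to a stop."""
    if detected in (SignalState.GREEN, SignalState.YELLOW):
        return cruise_mph
    # Red, or lights out entirely: stop before the intersection.
    return 0.0
```

The point of the sketch is that the fallback behavior is a hand-written rule keyed off the perception output, not something that has to be inferred from training data.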

Lights out should be treated as an all-way red, including for pedestrians.

  • Not all-way red; that leads to exactly the problem in the story of blocking traffic. Lights out needs to be treated as a stop sign.

    • Yes, it does lead to blocking traffic, but that is the only safe action at such an intersection; if an intersection has traffic lights, there's enough traffic that stop-and-give-way is not a viable operation.

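The "dark signal = all-way stop" fallback debated above can be sketched as a simple ordering rule. This is a deliberate simplification: real right-of-way also depends on position (yield to the vehicle on the right) and on pedestrians, so `all_way_stop` and its arrival-order tie-breaking are illustrative only.

```python
def all_way_stop(arrivals):
    """Cars at a dark signal each stop, then clear the intersection
    one at a time in the order they arrived.

    arrivals: list of (car_id, arrival_time) tuples.
    Returns car ids in the order they proceed."""
    return [car_id for car_id, _ in sorted(arrivals, key=lambda car: car[1])]
```

Under this rule the intersection keeps moving, one vehicle per "turn", which is the behavior the stop-sign side of the argument wants instead of an indefinite all-way red.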