I wouldn't say "super easy" but if an autonomous vehicle isn't programmed to handle:
1: a traffic light with no lights (dark/blackout)
2: a traffic light flashing red
3: a traffic light flashing yellow
Then they are 100% not qualified to be on the road. Those are basic situations and incredibly easy to replicate, simulate, and incorporate into the training data.
That is to say, they are not edge cases. (A rough sketch of the handling rules is below.)
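For illustration, here is a minimal rules sketch in Python. All names are hypothetical, and the state-to-behavior mapping follows what most US traffic codes require for each signal state, not any particular vendor's stack:

```python
from enum import Enum, auto

class SignalState(Enum):
    GREEN = auto()
    YELLOW = auto()
    RED = auto()
    FLASHING_RED = auto()
    FLASHING_YELLOW = auto()
    DARK = auto()  # lights out / blackout

def intersection_behavior(state: SignalState) -> str:
    """Map a detected signal state to required behavior (illustrative)."""
    if state == SignalState.GREEN:
        return "proceed"
    if state == SignalState.YELLOW:
        return "proceed_only_if_unable_to_stop_safely"
    if state in (SignalState.RED, SignalState.FLASHING_RED):
        return "stop_then_proceed_when_clear"  # flashing red acts as a stop sign
    if state == SignalState.FLASHING_YELLOW:
        return "slow_and_proceed_with_caution"
    # DARK: most US codes treat a dead signal as an all-way stop
    return "treat_as_all_way_stop"
```

The point is that the decision table is tiny and enumerable, which is what makes these states simulable rather than edge cases.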
Dealing with other drivers in those settings is much harder, but that's a different problem, and you should be simulating your car against a wide variety of other-driver dynamics: from everyone being very nice to everyone being hyper-aggressive, and the full spectrum in between.
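As a rough sketch of what that sweep could look like (every parameter name and range here is invented for illustration, not anyone's actual simulator):

```python
import random

def sample_driver_profile(aggressiveness: float) -> dict:
    """aggressiveness in [0, 1]: 0 = very courteous, 1 = hyper-aggressive."""
    return {
        "gap_acceptance_s": 3.0 - 2.5 * aggressiveness,   # smaller gaps accepted
        "yield_probability": 1.0 - 0.9 * aggressiveness,  # how often they yield
        "speed_over_limit_mps": 5.0 * aggressiveness,     # typical speeding
    }

def build_scenario(n_drivers: int, rng: random.Random) -> list:
    # Draw each simulated driver from the full spectrum, not just the mean.
    return [sample_driver_profile(rng.random()) for _ in range(n_drivers)]

scenarios = [build_scenario(n_drivers=20, rng=random.Random(seed))
             for seed in range(1000)]
```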
If you are just arguing that they're not qualified to be on the road, then I agree with you. I've been an autonomous vehicle skeptic for a long time, mainly because I think our automobile transportation system is inherently dangerous. It's going to be a tough sell, though, considering that they are already -- generally -- better drivers than a nontrivial number of human beings.
It's a tough question. The entire reason I'm defending this shortcoming is exactly that I prefer the fail-safe shutdown to any attempt to navigate bizarre, blacked-out intersections that barely conform to the traffic code and are inherently dangerous.
Specifically, identifying road signs, traffic lights, and dead traffic lights is a narrow problem with feasible solutions, to the point where we can reasonably say “yeah, this sub-component basically works perfectly.”
Compare that to the overall self-driving problem, which is very much not super easy.
The cars already know those are intersections with lights. I'm not talking about that part. Just the basic logic that you don't go through at speed unless there is a green (or yellow) light.
>The cars already know those are intersections with lights.
That's not how any of this works. You can anthropomorphize all you like, but they don't "know" things. They're only able to predictably respond to their training data. A blackout scenario is not in the training data.
Even ignoring the observations we can make, the computers have maps programmed in. Yes, they do know the locations of intersections; no training necessary.
And the usual setup of an autonomous car is an object recognition system feeding into a rules system. If the object recognition system says an object is there, and that object is there, that's good enough to call "knowing" for the purpose of talking about what the cars should do.
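To make that shape of pipeline concrete, here's a hedged sketch of a map prior plus detections feeding a small rules layer. Everything here (names, the confidence threshold, the return values) is invented for illustration:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Detection:
    kind: str             # e.g. "traffic_light"
    state: Optional[str]  # "green", "yellow", "red", or None if dark/unreadable
    confidence: float

def approach_policy(map_expects_signal: bool, detections: List[Detection]) -> str:
    """Combine the map prior with live perception: if the map says a signal
    should be here and perception can't confirm green/yellow, don't go
    through at speed. Illustrative only."""
    lights = [d for d in detections
              if d.kind == "traffic_light" and d.confidence > 0.8]
    if any(d.state in ("green", "yellow") for d in lights):
        return "proceed_at_speed"
    if map_expects_signal or lights:
        # Red, dark, or unconfirmed at a known signaled intersection:
        # cut speed and treat it as a stop.
        return "slow_and_treat_as_stop"
    return "proceed_at_speed"
```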
Or to phrase things entirely differently: Finding lights is one of the easy parts. It's basically a solved problem. Cutting your speed when there isn't a green or yellow light is table stakes. These cars earn 2 good boy points for that, and lose 30 for blocking the road.
>They're only able to predictably respond to their training data. A blackout scenario is not in the training data.
Is there any way to read more about this? I'm skeptical that there aren't any human-coded traffic laws in the Waymo software stack and that it just infers everything from "training data".
Lights out should be treated as an all-way red, including for pedestrians.
Not all-way red; that leads to exactly the problem in the story, blocking traffic. Lights out needs to be treated as a stop sign.
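To make the disagreement concrete, a tiny sketch of the two behaviors (hypothetical names; the boolean inputs stand in for perception outputs):

```python
def dark_signal_action(policy: str, is_clear: bool, has_right_of_way: bool) -> str:
    """Contrast the two proposals above. Illustrative only."""
    if policy == "all_way_red":
        # Treats the blackout as a permanent red: the car never proceeds,
        # which is exactly the road-blocking failure mode in the story.
        return "remain_stopped"
    if policy == "stop_sign":
        # Stop fully, yield by normal right-of-way rules, then go.
        if is_clear and has_right_of_way:
            return "proceed_when_clear"
        return "wait_for_turn"
    raise ValueError(f"unknown policy: {policy}")
```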