
Comment by raldi

3 days ago

I'm surprised that either:

1. Nobody at Waymo thought of this,

2. Somebody did think of it but it wasn't considered important enough to prioritize, or

3. They tried to prep the cars for this, and yet they failed this badly.

Everyone should have understood, either because it's common sense or because so many people have been shouting it for so long, that driving requires improvisation in the face of uncommon but inevitable bespoke challenges that this generation of AI is not suited for.

  • What improvisation is required? A traffic light being out is a standard problem with a standard solution. It's just a four-way stop.

    • In many versions of the road rules (I don't know about California), having four vehicles stopped at an intersection with none of the four lanes having priority creates a dining-philosophers deadlock, where each vehicle is giving way to another.

      Human drivers can use hand signals to resolve it, but self-driven vehicles may struggle, especially if all four lanes happen to have a self-driven vehicle arrive. If all the vehicles are coordinated by the same company, they could potentially coordinate out-of-band to avoid the deadlock (a minimal sketch of one tie-breaking approach is below). It becomes even more complex with a mix of cars coordinated by different companies.
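      A minimal sketch of that symmetric case, in Python with made-up names and a hypothetical fleet-wide tie-break order (not anything a real operator is known to use): the naive "yield to the vehicle on your right" rule lets nobody move when all four approaches are occupied, and any shared total order breaks the symmetry the way an out-of-band coordinator could.

      ```python
      # Approaches are labeled by the leg a vehicle arrives on; RIGHT_OF[x] is
      # the approach on x's right under right-hand traffic.
      RIGHT_OF = {"N": "W", "W": "S", "S": "E", "E": "N"}

      def naive_round(waiting):
          """Each vehicle proceeds only if the approach to its right is empty."""
          return [a for a in waiting if RIGHT_OF[a] not in waiting]

      def tie_broken_round(waiting, order):
          """Fall back to a shared total order when the naive rule stalls."""
          movers = naive_round(waiting)
          if not movers and waiting:
              movers = [min(waiting, key=order.index)]  # one agreed-upon vehicle goes
          return movers

      waiting = {"N", "E", "S", "W"}       # all four approaches occupied at once
      print(naive_round(waiting))          # [] -> the dining-philosophers deadlock
      order = ["N", "E", "S", "W"]         # hypothetical fleet-wide tie-break order
      while waiting:
          movers = tie_broken_round(waiting, order)
          print("proceed:", movers)
          waiting -= set(movers)
      ```

      The tie-break only helps if every vehicle agrees on the same order, which is plausible within one fleet and much harder in the mixed-company case above.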

  • To be fair, 'common sense' and 'many people have been shouting it' have a long history of being hilariously wrong about technical matters. Like claims that trains would cause organ damage to their riders from going at the blistering speed of either 35 or 50 mph, IIRC. Or that manned flight was impossible. Common sense would tell you that launching a bunch of satellites broadcasting precise clocks into orbit couldn't be used to work out where you are, and yet here we are with GPS.

  • I'd say driving only requires not handling uncommon situations dangerously. And stopping when you can't handle something fits that criterion.

    Also I'm not sure it's entirely AI's fault. What do you do when you realistically have to break some rules? Like here, I assume you'd have to cut someone off if you don't want to wait forever. Who's gonna build a car that breaks rules sometimes, and what regulator will approve it?

    • If you are driving a car on a public street and your solution to getting confused is stopping your car in the middle of the road wherever this confusion happens to arise, and sitting there for however long you are confused, you should not be driving a car in the first place. That includes AI cars.

  • But a citywide blackout isn’t that uncommon.

    • > But a citywide blackout isn’t that uncommon.

      I think too many people talk past each other when they use the word common, especially when talking about car trips.

      A blackout (it doesn't have to be citywide) may not be periodic, but it's certainly frequent, with a frequency above once per year (some back-of-envelope numbers below).

      Many people say "common" meaning "frequent", and many people say "common" meaning "periodic".

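      As a back-of-envelope sketch with purely hypothetical numbers (none of these are real operator figures): an event that happens roughly once a year still touches a lot of trips once you multiply by fleet size and outage duration, so "rare on the calendar" is not "rare from the fleet's point of view".

      ```python
      # All inputs below are assumptions for illustration only.
      blackouts_per_year = 1.5        # assumed: at least one grid event per year
      outage_hours = 2.0              # assumed duration of a typical event
      active_vehicles = 500           # hypothetical fleet size on the road
      trips_per_vehicle_per_hour = 2  # hypothetical utilization

      trips_exposed = (blackouts_per_year * outage_hours
                       * active_vehicles * trips_per_vehicle_per_hour)
      print(f"trips per year that hit dark signals: ~{trips_exposed:.0f}")
      # ~3000 trips/year under these assumptions
      ```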

    • It isn't? To me that's the main problem here, as this should be an exceptionally rare occurrence.

Likely 2. Not something that will make it into their KPIs. No one is getting promoted for mitigating black swan events.

  • Actually, that is specifically not true at Google, and I expect the same applies to Waymo.

    People get promoted for running DiRT exercises and addressing the issues they expose.

    Of course the problem is that you can't DiRT all the various Black Swans.

Clearly the cars can navigate themselves; it's the lack of remote ops that halted everything.