Comment by ajcp
20 days ago
Not at all. It's not the counterfactual they're generating, it's the "too rare to capture often enough to train a response to" that they're generating.
They're implying that without the model having knowledge of a scene to react to, even approximate knowledge, it doesn't react at all; it simply "yields" to the situation until it passes. In my experience taking Waymos almost daily, this holds.
I would rather not have the Waymo yield to a tornado, rising floodwaters, or a charging elephant...