Comment by okdood64

14 hours ago

> Waymo will still have to accept some responsibility

Why? That would only be true if they weren't supposed to be on the road in the first place, which is not the case.

Think of it like dog ownership: if my dog hurts someone, that's on me. Property that causes harm is the owner's responsibility.

If I program a machine and it goes out into the world and hurts someone who did not voluntarily release me from liability, that's on me.

In a technical sense, maybe, but it's all going to be about optics. They have a responsibility to handle the situation well even if it's not their fault, and the public will hold them accountable for whatever involvement it perceives, which may not match the actual scenario.

  • > In a technical sense, maybe, but it's all going to be about optics.

    Indeed, it is, and that is exactly why Waymo will have to accept some responsibility. I would bet that, internally, Waymo's PR and Legal teams are working overtime to coordinate the details with NHTSA. We, the general public, may never learn those details. Meanwhile, Waymo's technical teams (Safety, etc.) will also be working overtime to figure out what they could have done better.

    As I mentioned, this is a standard test, and Waymo likely has thousands of variations of it in their simulation platforms. They will sweep across all plausible parameters to tighten the test, including the MER (minimum expected response from the AV); they may raise the bar on the MER (e.g., brake at maximum deceleration in some cases, trading off comfort metrics there) and calculate the effects on surrounding traffic (e.g., "did we endanger the vehicles behind us by braking too hard? If so, by how much?"). All of these are expected actions that the general public will never see, except perhaps in technical papers.
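
    As a rough sketch of what one such sweep could look like (every field name, threshold, and number below is my own illustrative assumption, not Waymo's):

    ```python
    # Hypothetical parameter sweep over a darting-pedestrian scenario.
    # All fields, thresholds, and values are illustrative assumptions.
    import itertools
    from dataclasses import dataclass

    @dataclass
    class Scenario:
        av_speed_mps: float       # AV speed when the pedestrian appears
        occlusion_gap_m: float    # distance from the AV to the crossing point
        lateral_offset_m: float   # pedestrian's distance from the AV's lane
        ped_speed_mps: float      # pedestrian's crossing speed
        reaction_time_s: float    # perception + planning latency under test

    def stopping_distance(v: float, reaction_s: float, decel: float) -> float:
        """Distance covered during the reaction time plus constant-decel braking."""
        return v * reaction_s + v * v / (2.0 * decel)

    def collides(scn: Scenario, decel: float) -> bool:
        """Crude model: a collision occurs if the AV cannot stop short of the
        crossing point and the pedestrian reaches the lane before the AV passes."""
        if stopping_distance(scn.av_speed_mps, scn.reaction_time_s, decel) <= scn.occlusion_gap_m:
            return False
        t_av = scn.occlusion_gap_m / scn.av_speed_mps   # ignores slowdown while braking
        t_ped = scn.lateral_offset_m / scn.ped_speed_mps
        return t_ped <= t_av

    # Two candidate MERs: comfort-limited braking vs. maximum-effort braking.
    COMFORT_DECEL = 3.5  # m/s^2, illustrative
    MAX_DECEL = 8.0      # m/s^2, illustrative

    grid = itertools.product(
        [6.7, 11.2],        # av_speed_mps (~15 and ~25 mph)
        [5.0, 10.0, 15.0],  # occlusion_gap_m
        [1.0, 2.0],         # lateral_offset_m
        [1.5, 3.0],         # ped_speed_mps
        [0.3, 0.5, 0.8],    # reaction_time_s
    )

    saved, total = 0, 0
    for params in grid:
        scn = Scenario(*params)
        total += 1
        if collides(scn, COMFORT_DECEL) and not collides(scn, MAX_DECEL):
            saved += 1

    print(f"{saved}/{total} scenarios avoided only by raising the MER to max braking")
    ```

    A real pipeline would additionally score each braking policy's comfort cost and rear-traffic risk, which is exactly the trade-off described above.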

    Regardless, the PR effects of this collision do not look good, especially as Waymo expands its service to other cities (Miami just announced; London by EOY2026). This coverage has the potential to do more damage to the company than the actual physical harm did to the poor traumatized kid and his family. THAT is the responsibility only the company will bear.

    To be sure, my intuition tells me this is not the last such collision. Expect more of them, from other companies, as they commercialize their own services. It's a matter of statistics.

Bringing a vehicle onto the public roads is a privilege, not a right. Any harm to pedestrians that results is your responsibility, not anyone else's.

The performance of a human is inherently limited by biology, and the road rules are written with this in mind. Machines don't have this inherent limitation, so the rules for machines should be much stricter.

I think there is an argument for incentivising the technology to be pushed to its absolute limits by making the machine's operator 100% liable. That's not to say the accident rate has to be zero in practice, but it has to be so low that any remaining accidents can be economically covered by insurance.
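
To make the "covered by insurance" bar concrete, here is a back-of-the-envelope version of that condition; both numbers are assumptions of mine, purely for illustration:

```python
# Illustrative check of whether 100% liability is insurable per mile driven.
# Both inputs are assumed numbers, not real Waymo or industry figures.
injury_crashes_per_million_miles = 0.5    # assumed at-fault crash rate
avg_liability_per_crash_usd = 2_000_000   # assumed average payout

expected_cost_per_mile = (
    injury_crashes_per_million_miles / 1_000_000
) * avg_liability_per_crash_usd
print(f"expected liability ~ ${expected_cost_per_mile:.2f} per mile")
# ~ $1.00/mile here: a premium that large would swamp fare revenue, so the
# crash rate must fall far below this for full liability to be economical.
```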

  • At least in the interim, wouldn't doing what you propose cause more deaths, if robot drivers are already less harmful than humans but the rules demand a standard stricter than that? (I can see the point of tightening the rules as better options become available, but by that logic, shouldn't we already be moving toward requiring robots and outlawing human drivers, if that's safer?)