
Comment by gortok

21 hours ago

For me, the policy question I want answered is this: if this were a human driver, there would be a clear person to sue for liability and damages. For a computer, who is ultimately responsible when someone sues for compensation? Is it the company? An officer of the company? This creates a situation where a company can afford to bury litigants in costs just to bring suit, whereas a private driver would lean on their insurance.

So you're worried that instead of facing off against an insurance agency, the plaintiff would be facing off against a private company? That doesn't seem like a huge difference to me.

Is there actually any difference? I'd have thought that the self-driving car would need to be insured to be allowed on the road, so in both cases you're going up against the insurance company rather than the actual owner.

Waymo hits you -> you seek relief from Waymo's insurance company. Waymo's insurance premiums go up. Waymo can weather a LOT of that. Business is still good. Thus, poor financial feedback loop. No real skin in the game.

John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.

NOW ... add criminal fault due to a driving decision or the state of the vehicle ... John goes to jail. Waymo? Still making money in the large. I'd like to see more skin in their game.

  • > John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.

    John probably (at least where I live) does not have insurance. Maybe I could sue him, but he has no assets to speak of (especially if he is living out of his car), so I'm just going to pay a bunch of legal fees for nothing. He doesn't care, because he has no skin in the game. The state doesn't care either: they aren't going to throw him in jail or even take away his license (if he has one), and they aren't going to impound his car.

    Honestly, I'd much rather be hit by a Waymo than John.

    • I see. Thank you for sharing. Insurance is mandatory here for all motorists.

      If you are hit by an underinsured driver, the government steps in, and additional underinsured-motorist protection (e.g. for being hit by an out-of-province/country motorist) is available to everyone and is not expensive.

      Jail time for an at-fault driver here is very uncommon but can be applied if serious injury or death results from a driver's conduct. This is quite conceivable with humans or AI, IMO. Who at Waymo will face jail time, as a human driver would in the same scenario?

      Hit and run (leaving the scene) is also a criminal offence with potential jail time for a human motorist. You would hope this is unlikely with AI, but if it happens even a small percentage of the time, who at Waymo faces jail as a human driver would?

      I'm talking about edge cases here, not the usual fender bender. But this thread was about policy/regs and that needs to consider crazy edge cases before there are tens of millions of AI drivers on the road.


  • > John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.

    Ah great, so there's a lower chance of that specific John Smith hitting me again in the future!

    • Yes, that is the specific deterrence effect.

      The general deterrence effect we observe in society is that punishment of one person has an effect on others who observe it, making them more cautious and less likely to offend.