Comment by akersten

2 days ago

My point is that we wouldn't have roads at all, because no contractor in their right mind would take on unbounded risk for limited gain.

Which, in the case of digital replicas that can feign real people, may be worth considering. Not blanket legislation as proposed here, but something that signals the downstream risks to developers in order to discourage undesired uses.

  • Then only foreign developers will be able to work with these kinds of technologies... the tools will still be made, they'll just be made by those outside the jurisdiction.

  • Unless they released a model named "Tom Cruise-inator 3000," I don't see any way to legislate that intent that would provide any assurances to a developer that their misused model couldn't result in them facing significant legal peril. So anything in this ballpark has a huge chilling effect in my view. I think it's far too early in the AI game to even be putting pen to paper on new laws (the first AI bubble hasn't even popped, after all) but I understand that view is not universal.

    • I would say a text-based model carries a different risk profile compared to video-based ones. At some point (now?) we'd probably need to have the difficult conversation about what level of media impersonation we are comfortable with.


Selling anything involves taking on unbounded risk for limited gain. That's why the limited liability company exists.

Risk becomes bounded by the total value of the company, and you can start acting rationally.

  • Historically, it's the other way around: limited liability for corporations let juries feel free to award absurdly high judgments against them.