Comment by catoc

17 hours ago

Yes, and most of us won’t break into other people’s houses, yet we really need locks.

This isn't a lock.

It's more like a hammer that makes its own independent evaluation of the ethics of every project you try to use it on, and refuses to work whenever it judges against that – sometimes inscrutably, or for obviously poor reasons.

If I use a hammer to bash in someone else's head, I'm the one going to prison, not the hammer or the hammer manufacturer or the hardware store I bought it from. And that's how it should be.

  • Given the increasing use of them as agents rather than simple generators, I suggest a better analogy than "hammer" is "dog".

Here are some rules about dogs: https://en.wikipedia.org/wiki/Dangerous_Dogs_Act_1991

    • How many people do dogs kill each year, in circumstances nobody would justify?

      How many people do frontier AI models kill each year, in circumstances nobody would justify?

      The Pentagon has already received Claude's help in killing people, but the ethics and legality of those acts are disputed – when a dog kills a three year old, nobody is calling that a good thing or even the lesser evil.


  • This view is too simplistic. AIs could enable someone with moderate knowledge to create chemical or biological weapons, sabotage firmware, or write highly destructive computer viruses. At least to some extent, uncontrolled AI has the potential to give people all kinds of destructive skills that are normally rare and tightly controlled. The hammer analogy doesn't really fit.