Comment by nilkn

1 day ago

Anthropic specifically called out systems "that take humans out of the loop entirely and automate selecting and engaging targets".

I take that to mean they don't want the military using Claude to decide who to kill. As a hyperbolic yet frankly realistic example, they don't want Claude to make a mistake and direct the military to kill innocent children accidentally identified as narco-terrorists.

At least, that's the most charitable interpretation of everything going on. I suspect Anthropic is also worried that the sitting administration wants to use AI to help it execute a full autocratic takeover of the United States, and that the administration is attempting to kill one of the world's most innovative companies to set an example and pressure other AI labs into letting their technology be used for such purposes.

Right. Did the DoW ask for that? Or does Anthropic make a product that does that?

  • Obviously Anthropic does make a product that could do that -- just give Claude classified data and ask it who to target.

    Obviously the military wants to use it for that purpose since they couldn't accept Anthropic's extremely limited terms.

    One can easily and immediately infer that the answers to both your questions are yes.

    • The DoW has explicitly said they don’t want this, and what you are describing is not automated kill drones.

      Anthropic’s safeguards already prevent what you are describing, which is, again, the thing the DoW has said they don’t want.


  • The DoD is explicitly asking for those things, by forcing renegotiation toward a contract that is identical in every way except for the removal of the prohibition on those things.

    If the DoD did not want those things, it would not be forcing a contract renegotiation to include them, at great cost to the government.

    • No, the DoW may be implicitly asking for those things.

      That’s the point I’m trying to make: Anthropic should just say the unsaid thing here.

      DoW asked for the following thing: $foo. We won’t give that to them.
