Comment by heresie-dabord

15 hours ago

> much broader implications

Setting aside the spectacular metastasis of a lawless kakistocracy that is literally rewriting the facts on record...

Anthropic's leadership has wisely attempted to make it clear that its product is not fit for the US DoD's purpose/objective, which is automated killing at scale.

It would be (is) grossly, historically negligent to operate weapons with LLMs. Anthropic built systems for a thuggocracy that only understands bribery, blackmail, and force.

  • Anthropic isn’t the inventor here, they are a service provider. The government can easily go find a different service provider, or if none of them will allow their service to be used for war, then the government should develop their own tech.

    Saying the government can just nationalize any company purely because they want to use the tech to kill people has pretty big implications and is historically against what this country stands for.

  • >That’s not their call to make. Inventors of technologies that could be used for war have never had the right to deny access to those technologies to the elected civilian government.[1]

    >[1] The government can make you go over to southeast Asia and kill people personally.

    Is this a normative statement? In other words, are you simply claiming "the government has men with guns and therefore can force people/companies to do whatever they want", or are you claiming that "the government should be able to commandeer civilian resources for whatever it wants"?

    • It’s a descriptive statement about the law. But you’re mischaracterizing the normative principle underlying the law. It’s not based on power, but rather the moral duties incumbent on citizens.

  • Anthropic can certainly make the call to deny access this way, but then the US govt can choose not to make contracts with Anthropic. So what's the issue?

    • The whole reason this is a story is that the government won't just refuse to contract, it will put the equivalent of soft sanctions on the company because Anthropic refuses to contract.

  • I have seen a lot of your posts on here about political topics, and they are always disingenuous, misleading, and geared towards providing a thin veneer of reasonability over any form of morality.

    > If Congress doesn’t want AI-powered killing machines, they’re the ones who have the right to make that call.

    You have it backwards, and you know it. If Congress wants to invoke natsec concerns to force companies to sell to the federal government, then it has to explicitly say so, and any such legislation, and any exercise of executive power pursuant thereto, would be heavily litigated.

    > The government can make you go over to southeast Asia and kill people personally. It’s totally incompatible with that to say companies should be allowed to veto the use of their technologies in war.

    Yes, it's legal to have drafts, but that's not relevant here, and the draft also includes certain exceptions for conscientious objectors. It doesn't matter if it's paradoxical or ironic that an individual could be pressed into military service while a private company doesn't have to sell stuff to the federal government.

    • > geared towards providing a thin veneer of reasonability over any form of morality

      Arguing “morality” is usually pointless. There’s no need for discussion among people who agree on what’s moral. But where they don’t agree, invoking morality won’t get anyone anywhere.

      It’s more productive to instead explain how certain policies follow from moral principles that we may not agree on, but we can at least acknowledge are broadly held in society.

      > You have it backwards, and you know it. If Congress wants to invoke natsec concerns to force companies to sell to the federal government, then they have to explicitly say so

      Congress did that back in 1950, with the Defense Production Act.