Comment by AlexVranas

1 day ago

OpenAI is playing games.

When Anthropic says they have red lines, they mean "We refuse to let you use our models for these ends, even if it means losing nearly a billion dollars in business."

When OpenAI says they have red lines, they mean "We are going to let the DoD do whatever the hell they want, but we will shake our fist at them while they do it."

That's why they got the contract. The DoD was clear about what they wanted, and OpenAI wasn't going to get anywhere without agreeing to that. They're about as transparent as Mac from It's Always Sunny in Philadelphia when he's telling everyone he's playing both sides.

"Red lines" does not mean some philosophical line they will not cross.

"Redlines" are edits to a contract, sent by lawyers to the other party they're negotiating with. They show up in Word's Track Changes mode as red strikethrough for deleted content.

They are negotiating the specifics of a contract, and Anthropic's contract was overly limiting to the DoD, whereas OpenAI's was not.

  • That’s not how the term is being used here.

    In this case “red lines” is being used to mean “lines that cannot be crossed.”

    Anthropic wanted guardrails on how their tech was used. DOD was saying that wasn’t acceptable.

I am going to stop using ChatGPT immediately.

> but we will shake our fist at them while they do it

Not even that. They are not shaking anything except their booty.

Personally I think OpenAI is intending to infiltrate their political enemy's stronghold and look for ways to leak data to "get Trump" as per usual.

They'll say "oops" and then we'll spend the next few years listening to pointless Congressional hearings.

Why DoD and not DoW?

Isn't it simpler to say that Anthropic adopted a values-based use approach and OpenAI adopted a legal one?

Or, in other words, there are two ways to decide how a lucrative property may be used:

1. Designate it private and draft terms for how you allow it to be used, per your value system (as long as those values don't violate any laws).

2. In the face of competition, give up some values and agree to a legal definition of use that favors you.

  • What does 'a legal approach' mean where there is no rule of law? The USA just bombed another country without any domestic legal basis for it. Can't imagine they're holding back on AI use that's illegal -- even textbook-clear war crimes (like blowing up shipwrecked people) don't give Hegseth and Trump pause.

    That goes for domestic actions too: they're happy to arm a paramilitary and set it loose against citizens who aren't politically aligned with Trump... and the Republican Senate barely blinks. Hard to imagine they'd care about AI use in mass surveillance, or AI in automated anti-personnel weapons. The Senate will say, 'Oh no, they unlawfully killed US citizens, again... Welp, let me check my insider trading gains... yeah, seems fine.'