Comment by jpgreenall

5 hours ago

Unsettling that the example talks about trajectories of long-range projectiles, given recent events.

Was there a recent archery incident?

  • OpenAI just took a major US military contract from Anthropic because Anthropic had morals and wouldn't let the US military use Claude to surveil or attack US citizens ...

    ... and OpenAI didn't. The military said (effectively) "we need to be able to use AI illegally against our own citizens", and OpenAI said "we'll help!"