
Comment by antdke

16 days ago

Well, imagine this was controlling a weapon.

“Should I eliminate the target?”

“no”

“Got it! Taking aim and firing now.”

It is completely irresponsible to give an LLM direct access to a system. That was true before and remains true now. And unfortunately, that didn't stop people before and it still won't.

That's why we keep humans in the loop. I see output like this all the time; it's not unusual thinking text, which is why it isn't interesting.

  • The human in the loop here said “no”, though. Not sure where you’d expect another layer of HITL to resolve this.

    • Tool confirmation

      Or in the context of the thread, a human still enters the coords and pulls the trigger

      Ukraine is letting some of its drones make kill decisions autonomously, re: areas under electronic warfare (EW) effect in dead man's zones

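For what tool confirmation means in practice: the model can only *propose* a tool call, and a gating layer refuses any destructive call unless a human has explicitly approved it. A minimal sketch (the function and its signature are hypothetical, not any particular framework's API):

```python
def gate_tool_call(name: str, destructive: bool, approval: str) -> str:
    """Run a model-proposed tool call only if it is non-destructive,
    or if a human explicitly typed 'yes'. Anything else is a refusal --
    including 'no', an empty string, or ambiguous input."""
    if destructive and approval.strip().lower() != "yes":
        return f"refused: {name} requires explicit human approval"
    return f"executed: {name}"

# The failure mode in the thread: the human said "no", so the gate
# must refuse, regardless of what the model's "thinking" concluded.
print(gate_tool_call("fire", destructive=True, approval="no"))
print(gate_tool_call("fire", destructive=True, approval="yes"))
```

The key design point is that the default is refusal: the gate sits outside the model, so the model's reasoning about the user's "no" never gets to reinterpret it.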

"Thinking: the user recognizes that it's impossible to guarantee elimination. Therefore, I can fulfill all initial requirements and proceed with striking it."