Comment by analog8374

10 hours ago

It's a machine. It by definition lacks autonomy.

The act may be circuitously arrived at, but still. Somebody has to write and run the program.

That kind of dodges my question.

I’ll repeat it: Is there any time in the future where you believe a machine or set of machines could measurably outperform a human to the degree that they can coerce or overpower them with no human intervention?

  • (Ya sure, because repeating yourself is always so helpful)

    Well, leaving aside the "with no human intervention" part, which is a bit fuzzy.

    Ya sure. AI can already contrive erudite BS arguments at a moment's notice, sell stuff pretty well, and shoot guns with great accuracy.

    Do you?

    • Yes, I do.

      So, given that we agree that there will be superhuman robotic systems, would you disagree that such a system, at scale, would be impossible for a human or group of humans to overcome?
