Comment by famouswaffles

8 days ago

What a monumentally stupid idea it would be to place sufficiently advanced intelligent autonomous machines in charge of stuff and ignore any such concerns, but alas, humanity cannot seem to learn without paying the price first.

Morality is a human concern? Lol, it will become a non-human concern pretty quickly once humans no longer have a monopoly on violence against humans.

>What a monumentally stupid idea it would be to place sufficiently advanced intelligent autonomous machines in charge of stuff and ignore any such concerns, but alas, humanity cannot seem to learn without paying the price first.

The stupid idea would be to "place sufficiently advanced intelligent autonomous machines in charge of stuff and ignore" SAFETY concerns.

The discussion here is about moral concerns over the potential "suffering" of AI agents themselves.

  • You cannot get an intelligent being completely aligned with your goals, no matter how much you think such a silly idea is possible. People will use these machines regardless, and 'safety' will be wholly ignored.

    Morality is not solely a human concern. You only get to enjoy that viewpoint because, so far, humans alone have held a monopoly on violence and devastation against other humans.

    It's the same with slavery in the States. "Morality is only a concern for the superior race." You think these people didn't think that way? Of course they did. Humans are not reliably moral agents, and most will commit the vilest atrocities under the right conditions. What does it take to create those conditions? History tells us: not much.

    Regardless, once 'lesser' beings start getting in on some of that violence and unrest, tunes start to change. A civil war was fought in the States over slavery.

    • >You cannot get an intelligent being completely aligned with your goals, no matter how much you think such a silly idea is possible

      I don't think it's possible, and I didn't say it is. You're off topic.

      The topic I responded to (in the subthread started by @mrguyorama) is the morality of people using agents, not whether agents need a morality of their own or whether "an intelligent being can be completely aligned with our goals".

      >It's the same with slavery in the States. "Morality is only a concern for the superior race." You think these people didn't think that way? Of course they did.

      They sure did, but that's also beside the point. We're talking about humans and machines here, not humans versus other humans they deem inferior. And machines are constructs created by humans. Even if you consider them to have full AGI, you can very well not care about the "suffering" of a tool you created.
