Comment by coldtea

6 days ago

>Holy fuck, this is Holocaust levels of unethical.

Nope. Morality is a human concern. Even when we're concerned about animal abuse, it's humans who are concerned, choosing on their own whether or not to be concerned (e.g., not considering eating meat an issue). There's no reason to extend such courtesy of "suffering" to AI, however advanced.

What a monumentally stupid idea it would be to place sufficiently advanced intelligent autonomous machines in charge of stuff and ignore any such concerns, but alas, humanity cannot seem to learn without paying the price first.

Morality is a human concern? Lol, it will become a non-human concern pretty quickly once humans no longer have a monopoly on violence against humans.

  • >What a monumentally stupid idea it would be to place sufficiently advanced intelligent autonomous machines in charge of stuff and ignore any such concerns, but alas, humanity cannot seem to learn without paying the price first.

    The stupid idea would be to "place sufficiently advanced intelligent autonomous machines in charge of stuff and ignore" SAFETY concerns.

    The discussion here is about moral concerns regarding potential AI agent "suffering" itself.

    • You cannot get an intelligent being completely aligned with your goals, no matter how much you believe such a silly idea is possible. People will use these machines regardless, and 'safety' will be wholly ignored.

      Morality is not solely a human concern. You only get to enjoy that viewpoint because, so far, humans have held a monopoly on violence and devastation against other humans.

      It's the same with slavery in the States. "Morality is only a concern for the superior race." You think those people didn't think that way? Of course they did. Humans are not inherently moral agents, and most will commit the vilest atrocities under the right conditions. What does it take to create those conditions? History tells us: not much.

      Regardless, once 'lesser' beings start getting in on some of that violence and unrest, tunes start to change. A civil war was fought in the States over slavery.


I think the Holocaust framing here might have been intended to be historically accurate, rather than a cheap Godwin move. The parallel being that during the Holocaust, people were reclassified as less than human.

Maybe not quite a problem yet. But moltbots are definitely a new kind of thing. We may need some kind of intermediate ethics (going both ways, mind).

I don't think society has dealt with non-biological agents before. Plenty of biological ones, though: hunting dogs, horses, etc. In 21st-century ethics, we do treat those differently from rocks.

Responsibility should go not just both ways but all ways: 'operators', bystanders, people the bots interact with (second parties), and the bots themselves too.