Comment by kmacdough

2 years ago

I think harmful AI emerges simply from asking an AI to optimize for some set of seemingly reasonable business goals, only to find that it does great harm in the process. Most companies would then enable such behavior by hiding the damage from the press to protect investors, rather than temporarily suspending business and admitting the issue.

Not only will they hide it, they will own it when exposed, and lobby to ensure it remains legal to exploit for profit. See oil industry.

Forget AI. We can't even come up with a framework that keeps people's seemingly reasonable goals from doing great harm in the process. We often don't have enough information until we try and find out that, oops, using a mix of rust and powdered aluminum (i.e., thermite) to try to protect something from extreme heat was a terrible idea.

  • > We can't even come up with a framework to avoid seemingly reasonable goals doing great harm in the process for people.

    I mean, it's not like we're trying all that much in a practical sense, right?

    Whatever happened to charter cities?

This is the well-known paperclip maximizer problem.

  • The relevance of the paperclip maximizer thought experiment seems less straightforward to me now. We have AI that is trained to mimic human behaviour on a large amount of data, plus reinforcement learning from a fairly large set of examples.

    It's not like we're giving the AI a single task and asking it to optimize everything towards that task. Or at least it's not architected for that kind of problem.

    • But you might ask an AI to manage a marketing campaign. Marketing is phenomenally effective, and there are loads of subtle ways for it to exploit people without being obvious from a distance.

      Marketing is already incredibly abusive, and that's run by humans who at least try to justify their behavior, and whose deviousness is limited by their creativity and communication skills.

      If any old scumbag can churn out unlimited high-quality marketing, it could become impossible to cut through the noise.