Comment by fardo

2 years ago

> Both of these explanations strike me as too clever by half. I think the parsimonious explanation is that people are actually concerned about the dangers of AI

This rings hollow when these companies don’t practice what they preach or set an example themselves: they don’t halt research or cut funding for the development of their own in-house AIs.

If you believe that there’s X-Risk of AI research, there’s no reason to think it wouldn’t come from your own firm’s labs developing these AIs too.

Continuing development while telling others they need to pause makes “I want you to be paused while I blaze ahead” far more parsimonious than “these companies are actually scared about humanity’s future” - they won’t put their money where their mouth is to prove it.

It's a race dynamic. Can you truly imagine any one of them stopping without the others agreeing? How would they tell that the others had really stopped? I think they do believe that what they're doing is dangerous, but they would rather be the ones to build it than let somebody else get there first - because who knows what they'll do.

It's all a matter of incentives, and people can easily act recklessly given the right ones. They keep going because they just can't stop.

The best way to not get nuked is to develop nukes first. That's the gist of their usual rebuttal to this argument.

  • Except the argument, projected to the dimension of WMDs, is not that AI is like nukes - rather, AI is like bioweapons. Nukes are dangerous when someone is willing to drop them at someone else. Bioweapons are inherently dangerous - the more you refine them, the worse it gets; eventually, you may build one so deadly that one careless handling mistake ends the world.

  • If those CEOs really thought AI was as bad as nukes they would actually dissolve their companies, destroy all their data, and go churn butter with the Amish instead. The US, having developed nukes first, now has the most nuclear warheads pointed at it.

  • That argument doesn't hold water when they also argue that the mere existence of nukes is dangerous. I would love to hear when Hinton had this revelation, given that his life's work was to advance AI.

  • Apart from Japan, I'd say America is the country that has historically come closest to being nuked, with the Soviet Union a close second.