
Comment by AlbertoGP

2 years ago

Yes, there are interests pushing for regulation using different arguments.

The regulation in the article concerns AIs assisting in the production of weapons of mass destruction, mentioning nuclear and biological weapons. Yann LeCun posted this yesterday about the other common argument, the risk of runaway AIs that would decide to kill or enslave humans; both arguments result in an oligopoly over AI:

> Altman, Hassabis, and Amodei are the ones doing massive corporate lobbying at the moment.

> They are the ones who are attempting to perform a regulatory capture of the AI industry.

> You, Geoff, and Yoshua are giving ammunition to those who are lobbying for a ban on open AI R&D.

> ...

> The alternative, which will *inevitably* happen if open source AI is regulated out of existence, is that a small number of companies from the West Coast of the US and China will control AI platform and hence control people's entire digital diet.

> What does that mean for democracy?

> What does that mean for cultural diversity?

https://twitter.com/ylecun/status/1718670073391378694

I find LeCun’s argument very interesting, and the whole discussion has parallels to the early regulation of and debate surrounding cryptography. For those of us who aren’t on Twitter and aren’t aware of all the players in this, can you tell us who he’s responding to, as well as who “Geoff” and “Yoshua” are?

I feel that, when it comes to pushing regulation, governments always start from the maximalist position, since it is the hardest to argue against:

- the government must regulate the internet to stop the spread of child pornography

- the government must regulate social media to stop calls for terrorism and genocide

- the government must regulate AI to stop it from developing bio weapons

...etc. It's always easiest to push regulation through these angles, but the resulting regulation then covers 100% of the regulated subject, rather than the 0.01% that was the "intended" target.

  • At the risk of sounding pedantic, it's probably worth pointing out that this executive order isn't really regulating AI.

    That's Congress's job.

    It's doing some guideline stuff and specifying how AI is used internally in the government and by government-funded entities.

    We're still free to develop AI any way we choose.