Comment by cubefox

2 years ago

These are good points. But it seems likely governments will make sure social trust is maintained when problems arise in that regard — unless people get addicted to AI assistants as fake friends and resist restricting them. But insufficient laws can be changed and extended, and even highly addictive things like tobacco have been successfully restricted in the past.

A much more terrifying problem is the misuse of AI by terrorists, or by aggressive states and militaries. Or even the danger of humanity losing control of an advanced AI system which doesn't have our best interests in mind. This could spell our disempowerment, or even our end.

It is not clear what could be done about this problem. Passing laws against it is not a likely solution. The game theory here is simply not in our favor. It rewards the most reckless state actors with wealth and power, until it is too late.

>things like tobacco have been successfully restricted in the past

At least in the US, do you have any idea how long that battle took, and how directly the tobacco companies lied to everyone and paid off politicians? Tobacco set up the handbook for corporations lying to their users in long, expensive battles. If AI turns out to be highly dangerous, we'll all be long dead while the corporate lawyers draw it out in court for decades.

  • A similar mistake was made by Ezra Klein in the NYT at the end of his opinion piece (1).

    The 'we can regulate this' argument relies on heavy, heavy discounting of the past, often paired with heavy discounting of the future. We did not successfully regulate tobacco; we failed, and millions suffered and died from manufactured ignorance. We did not successfully regulate the fossil fuel industry; we again massively failed.

    But if you, in the present day, sit comfortably, did not personally get lung disease, have not endured the past, present, and future harms of fossil fuels — and in fact have benefited — it is easy to be optimistic about regulation.

    1. https://www.nytimes.com/2023/11/22/opinion/openai-sam-altman...

    • In the US, at least, a new generation of congresspeople may not go by that same playbook.

      We’re still dealing with most of the same politicians who were there in the late 80s. The anti-labor, pro-corporatist congress, if you will.

      Maybe it’ll always be the same, but as the congress demographics changes, appeals to history don’t seem as strong.
    • All the more reason to start trying to regulate it early, before it is even more firmly entrenched in society.

    • I mean, smoking prevalence did in fact decrease, so this is at least a partial success. But other things can't be regulated this "easily". If restricting smoking was hard, and restricting AI that undermines social trust is hard, then there are things that are even harder to prevent — much harder.
> But it seems likely governments will make sure social trust is maintained when problems arise in that regard.

It is a conspiracy theory of course (and therefore wrong, of course), but some people are of the opinion that somewhere within the vast unknown/unknowable mechanisms of government, there are forces that deliberately slice the population up into various opposing mini-ideological camps so as to keep their attention focused on their fellow citizens rather than their government.

It could be simple emergence, or something else, but it is physically possible to wonder what the truth of the matter is, despite it typically not being metaphysically possible.

> The game theory here is simply not in our favor. It rewards the most reckless state actors with wealth and power, until it is too late.

There are many games being played simultaneously: some seen, most not. And there is also the realm of potentiality: actions we could take, but consistently "choose" to not (which could be substantially a consequence of various other "conspiracy theories", like teaching humans to use and process language in a flawed manner, or think in a flawed manner).

  • >”It is a conspiracy theory of course (and therefore wrong, of course), but some people are of the opinion that somewhere within the vast unknown/unknowable mechanisms of government, there are forces that deliberately slice the population up into various”

    What you are talking about is called polling or marketing. It’s structural, not a conspiracy. It’s inherent to most all statistical analysis, and everyone uses statistics.

    • Take the temporal relationship between Occupy Wall Street and the culture war battles (as can be seen on Google trends).

      It is physically possible for a media campaign to be launched to increase the amount of discussion of such topics, getting the public fired up and arguing with each other and leaving them less attention for things like the issues surrounding Occupy Wall Street. Whether this actually happened is an epistemological matter, but it is not polling, and it is not mere marketing, though one could frame it as a kind of marketing: "here are some things to fight over, plebs... are you interested in doing so?".

      And this is just one instance. And defending a claim of nonexistence is essentially impossible, especially when the matter is largely subjective.

    • The conspiracy theory is also a form of craziness, an utter failure of theory of mind — the same sort which is convinced that <current popular leisure> is a distraction specifically intended to divert attention from <pet issue that only they really care about>. It was seen across the spectrum, with bizarrely specific noughties chestnuts like <reality TV/the war in Iraq>.

      This in some cases results in self-fulfilling prophecies, like the "footloose church" which believes that dancing is driving young people away from the church, rather than the church's own behavior.