Comment by pron
4 hours ago
I think there's another problem with AI doomerism, which is the belief that superhuman intelligence (even if such a thing could be defined and realised) results in godlike powers. Many if not most systems of interest in the world are non-linear and computationally hard; controlling or predicting them requires raw computational power that no amount of intelligence (whatever that means) can compensate for. On the other hand, dynamics we do (roughly) understand and can predict don't require much intelligence, either. To the extent some problems are solvable with the computational power we have, some may require data collection and others may require persuasion through charisma. The claim that intelligence is the factor we're lacking is not well supported.
Ascribing a lot of power to intelligence (which doesn't quite correspond to what we see in the world) is less a careful analysis of the power of intelligence and more a projection of personal fantasies by people who believe they are especially intelligent and don't have the power they think they deserve.
Political power is the bottleneck for most shit that matters, not computational power.
Most of the stuff that sucks in the US sucks because of entrenched institutions with perverse incentives (health insurers, tax-filing companies) and congressional paralysis, not computational bottlenecks. Raw intelligence is thus limited in what it can achieve.
> I think there's another problem with AI doomerism, which is the belief that superhuman intelligence (even if such a thing could be defined and realised) results in godlike powers.
I agree with this. The main piece of evidence supporting it is to just look at highly intelligent humans. Folks at the tail ends of the bell curve mostly don't end up with "godlike powers" or anything even approximating that; they grind away their lives as white-collar professionals, working in jobs surrounded by far less intelligent peers. They may publish higher-quality papers, write better software, or have better outcomes, but they're working in the same jobs as everyone else. We have no political or economic will to build serious think tanks to work on societal-scale problems, and even if we did, nobody would listen to the outcome.
So let's assume ASI becomes a thing: what does it change?
> Ascribing a lot of power to intelligence (which doesn't quite correspond to what we see in the world)
Which animal would you say has god-like power over all other animals?
I don't think any of them do. Some organisms/viruses or groups of organisms could destroy humans more easily than humans could destroy them.
There's no doubt humans possess some powers (though certainly not godlike) that other organisms don't, but the distinction seems to be binary (human vs. non-human) rather than scaling with intelligence. E.g. the intelligence of dolphins, apes, and some birds doesn't seem to offer them any special control over other organisms (and it didn't even before humans arrived). So even if there could be such a thing as superhuman intelligence, I don't think it's reasonable to assume it could achieve control over humans (now, superhuman charisma may be another matter).
> Some organisms/viruses or groups of organisms could destroy humans more easily than humans could destroy them.
"Destruction" is only one power that could be a component of "godlike power". There are several more; like power of intentional selective breeding, power of species creation (also via intentional selective breeding), etc.
What about the power of granting happiness or misery to large swathes of a species (chickens, anyone?)
Do you consider viruses to be animals?
fungus.
Oh, wait, that's not an animal. My bad.
I don't agree with you. Let's assume intelligence is not what confers power, but rather something else does. In your opinion, what would a superhuman be like? On what dimensions would they be better than us?
Do you not agree that there could be entities more powerful than us?