Comment by tshaddox
5 days ago
> - What if there are several paths to different kinds of intelligence with their own local maxima, in which the AI can easily get stuck after optimizing itself into the wrong type of intelligence?
I think what's more plausible is that there is general intelligence, and humans have that, and it's general in the same sense that Turing machines are general, meaning that there is no "higher form" of intelligence that has strictly greater capability. Computation speed, memory capacity, etc. can obviously increase, but those are available to biological general intelligences just like they would be available to electronic general intelligences.
I agree that general intelligence is general. But a 1000x increase in computation speed could still be available to machines and not to humans, simply because electrons are faster than neurons. Also, how, specifically, would you increase human memory 1000x?
The first way we increased human memory by 1000x was with books. Now it’s mostly with computers.
Electronic AGI might have a small early advantage because it’s probably easier for them to have high-speed interfaces to computing power and memory, but I would be surprised if the innovations required to develop AGI didn’t also help us interface our own biology with computing power and memory.
In my view this is no more concerning than saying “AGI will have a huge advantage in physical strength because of powerful electric motors, hydraulics, etc.”