Comment by dustfinger
2 years ago
> Superintelligence is within reach.
Don't we need to achieve artificial general intelligence first? Or is this "super" intelligence just that? To me, superintelligence would be beyond our own and therefore beyond general AI as well.
Once you have AGI, the theory is (as far as I can tell) that you can dedicate more cycles to it and get an exponential increase in "intelligence", which is basically just "thinking faster". Personally, I don't fully buy that; I think some problems can only be solved, by any intelligence, at the speed of time in the real world. More cycles dedicated to mathematics would certainly advance that field more rapidly, but a lot of biological puzzles probably can't be accelerated like this.
> Once you have AGI, you can...
So if Superintelligence depends on AGI, then we can't say:
> Superintelligence is within reach.
until we can say AGI is within reach. Is there any substantive evidence supporting the revised claim? Do we even have a reliable way to evaluate whether an AI is an AGI? The standard answer was the Turing Test, but in recent years there have been multiple claims that it has been broken by non-AGI systems [1].
To me, AGI still feels a long ways off.
[1]:
- https://www.nature.com/articles/d41586-023-02361-7
- https://www.washingtonpost.com/technology/2022/06/17/google-...
> So if Superintelligence depends on AGI,
Not sure. Is superintelligence generic? Is it alive? Does it have motivation and goals? It sounds like a big calculator that he will personally control. Technically that's possible. LLMs aren't the end of AI evolution.
> To me, AGI still feels a long ways off.
It depends on how you define it. Just a couple of years ago, the level of current LLMs sounded like sci-fi. But here we are, and progress is accelerating. I wonder how many labs and individuals are working on AGI while trying to avoid attention. It feels like it could happen any time now. Maybe something subhuman first.
Yeah, I'm saying the same thing: you need AGI to get whatever "superintelligence" is. That's mostly my opinion, but I wouldn't call it SI without AGI. Personally, I'm no AI expert, but we're very far from AGI unless one of these experts knows things about current research that I don't. LLMs are interesting, as are other large models like vision models, but their abilities don't impress me yet. I'll be impressed when they can replace someone other than content farmers.