Comment by roenxi
6 days ago
If there is a gap in intelligence between two humans, does that mean to you that one of them is necessarily not a general intelligence? The current crop of AIs get some of the questions right by reasoning through them. That means they are already intelligent in the way ARC-AGI-2 measures intelligence; they just aren't very capable.
If AIs at least equal humans in all intellectual fields, then they are superintelligences, because there are already fields where they dominate humans so outrageously that there isn't a competition (nearly all fields, these days). Before they become superintelligences there is a phase where they are just AGIs, and we've been in that phase for a while now. Artificial superintelligence is very exciting, but artificial non-super intelligence, i.e. AGI, is already here with us in the present.
You can define AGI however you want, I suppose, but I would consider it achieved when AI can achieve at least about median human performance on all cognitive tasks. Obviously computers are useful well before this point, but it is a clearly meaningful line in the sand, useful enough to merit having a dedicated name like "AGI". Constructed tasks like ARC-AGI simply quantify what everyone can already see: current models cannot be used as a drop-in replacement for humans in most cases.
To me, superintelligence specifically means either dominating us in our highest intellectual accomplishments (i.e. math, science, philosophy) or literally dominating us by subordinating or eliminating humans. Neither of these has happened at all.
> but I would consider it achieved when AI can achieve at least about median human performance on all cognitive tasks
What do you consider below-median humans to be? Are they meat-zombies? The threshold for general intelligence is at least somewhere near the minimum of human performance, and it wouldn't surprise me if people performing at that level can't do the ARC-AGI test either.