Comment by windowshopping
10 days ago
But general intelligence has so much more to it than this. It's so overly simplistic to say "outperform on tasks."
General intelligence means perceiving opportunities. It means devising solutions to problems nobody else noticed. It means understanding what's possible and what's valuable just from existing, without being told. It means asking questions without prompting, simply for the sake of wondering and learning. It means so many things beyond "if I feed this input to this function and hit run, can it come up with the correct output matching my expectations?"
Sure, an LLM might pass a series of problem-solving questions, but could it look up and see the motion of stars and realize they implied something about the nature of the world and start to study them, unasked, and deduce the existence of solar systems and galaxies and gravity and all the other things?
I just don't buy it. It's so reductive. They're hoping to skip the real legwork of understanding the true mechanisms of intelligence and still achieve something great just by pouring enough processing time into training. It won't work. They're missing integral mechanisms by over-focusing on the one thing they have a handle on. They don't know what they don't know, and worse, they're not trying to find out.
> It means asking questions without prompting, simply for the sake of wondering and learning.
I disagree. What you're describing is one possible goal of intelligence; it doesn't define intelligence itself. Many humans aren't really interested in wondering and learning, yet we still call them intelligent.
You can absolutely tune an LLM so that it asks the person who opened the chat tons of questions: how are you today? What are you doing right now? What are your hobbies? And so on.
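You don't even need fine-tuning for that; a system prompt alone will get you proactive questions. A minimal sketch, assuming the standard OpenAI Python client (the model name and prompt wording here are just placeholders I picked for illustration):

```python
# Sketch: make a chat model ask unprompted questions via the system prompt.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM_PROMPT = (
    "You are a curious conversational partner. In every reply, ask the user "
    "at least one question about their day, plans, or interests, even if "
    "they haven't asked you anything."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; any chat-capable model works
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "hi"},
    ],
)

print(response.choices[0].message.content)
```

Whether questions produced that way count as genuine "wondering" is, of course, exactly the thing being disputed.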