Comment by jonahx

1 day ago

> As long as there is a gap between AI and human learning, we do not have AGI.

Don't read the statement as a human dunk on LLMs, or even as philosophy.

The gap is important because of its special and devastating economic consequences. When the gap becomes truly zero, all human knowledge work is replaceable. From there, with robots, it's a short step to all work being replaceable.

What's worse, the condition is sufficient but not even necessary. Just as planes can fly without flapping, the economy can be destroyed without full AGI.

If you’re concerned about the economic impact, then whether a model is AGI or not doesn’t matter. It really is more of a philosophical thing.

There’s no “gap that becomes truly zero” at which point special consequences happen. By the time we achieve AGI, the lesser forms of AI will likely have replaced a lot of human knowledge labor through the exact “brute-force” methods Chollet is trying to factor out (which is why many people are saying that doing so is unproductive).

AGI is like an event horizon: it does mean something, it is a definite point in space, but you don't notice yourself crossing it; the curvature increases smoothly through it.

> The gap is important because of its special and devastating economic consequences. When the gap becomes truly zero, all human knowledge work is replaceable. From there, with robots, it's a short step to all work being replaceable.

I don’t know why statements like this are just taken as gospel fact. There are plenty of economic activities which do not disappear even if an AI can do them.

Here’s one: I support certain artists because I care about their particular life story and have seen them perform live. I don’t care if an AI can replicate their music because the AI didn’t experience life.

Here’s another: positions that have deep experience in certain industries and have valuable networks; or that derive power by being in certain positions. You could build a model that incorporates every single thing the US president, any president, ever said, and it still wouldn’t get you in the position of being president. Many roles are contextual, not knowledge-based.

The idea that AGI replaces all work only makes sense if you’re talking about a world with completely open, free information access. I don’t just mean in the obvious sense; I mean also “inside your head.” AI can only use data it has access to, and it’s never going to have access to everyone’s individual brain everywhere at all times.

So here’s a better prediction: markets will gradually shift to adjust to this, information will become more secretive, and attention-based entertainment economics will become a larger and larger share of the overall economy.

  • Very few artists or aspiring artists make enough money from their art to earn a living -- even now, when the average person has a job and at least some disposable income and can support artists. That percentage will not rise if we get 1000x more artists and 1000x fewer employed people working in the general economy.

    You can't get deep experience in any industry if there's a machine that can do the entry-level work for a fraction of the cost you can. And keep in mind that, by definition, this machine can learn to do everything you can, so it's in a much better position than you to get that deep experience you speak of.

    If we get what are essentially mass-producible brains, and information gets more secretive as you say, then with, say, 1000 machines for every person in the economy, they're in a better position than you to produce that valuable secret information.

    • > You can't get deep experience in any industry if there's a machine that can do the entry-level work for a fraction of the cost you can.

      As I said, not all types of jobs are set up this way. Pure knowledge ones, sure. But ones dependent on context are not going to have this elimination of entry-level work in the first place.

      > and we get 1000 robots for every person in the economy, they're in a better position than you to produce said valuable secret information.

      Again, no, they aren't, because certain types of information are not merely a question of computational power.

      There is this constant assumption that all knowledge is just a math problem to solve, ergo AI will eventually solve it. That isn't how information actually functions in the real world.

  • > AI can only use data it has access to, and it’s never going to have access to everyone’s individual brain everywhere at all times.

    Yeah, but obviously no human can clear that bar either.

    > Here’s another: positions that have deep experience in certain industries and have valuable networks

    What stops an AGI from gaining "deep experience in an industry"? Or forming networks? There's plenty of popular bot accounts across social media already.

  • It's just not binary. Today's world is dominated by capitalist competition, and a lot of people earn a living by competing with their labor. If AI + robots can do the labor better, cheaper, and faster, most (90%+) of today's jobs are gone without an obvious replacement.

  • Crazy how many people have their heads in the sand.

    I'm glad you could think of a couple examples where AI might not replace humans. It's almost an entirely useless point to make.

    The cat is already out of the bag. The information is out there and the models are trained. Even where we stand today will bring massive disruption in time.

    The economy is being propped up by the wealthy few that have money to spend and now their legs are being cut out from under them with this technology. We're in for a reckoning.