Comment by FloorEgg

3 days ago

Aren't you agreeing with his point?

The process of evolution distilled down all that "humongous" amount to what is most useful. He's basically saying our current ML methods to compress data into intelligence can't compare to billions of years of evolution. Nature is better at compression than ML researchers, by a long shot.

>Aren't you agreeing with his point? ... Nature is better at compression than ML researchers, by a long shot.

What I mean is basically the opposite: nature isn't better in the sense of being more efficient; it just had vastly more time and scale to do it in an inefficient way. The reason we learn quickly is that we can leverage that accumulated knowledge, in a manner similar to in-context learning or other multi-stage learning (the bulk of the training forms abstractions which the next stage then uses). It's really unlikely there's some magical architecture that is fundamentally better at sample efficiency than, e.g., transformers or any other architecture while training on bad underlying data. My intuition is there might even be a hard limit to that. The multi-stage bootstrap might be the key, not the architecture.

Same for the social process of knowledge transfer/compression.

Sample efficiency isn't the ability to distill a lot of data into good insights. It's the ability to get good insights from less data. Evolution didn't do that; it had a lot of samples to get to where it did.

  • > Sample efficiency isn't the ability to distill a lot of data into good insights

    Are you claiming that I said this? Because I didn't....

    There are two things going on.

    One is compressing lots of data into generalizable intelligence. The other is using generalized intelligence to learn from a small amount of data.

    Billions of years and all the data that goes along with it -> compressed into efficient generalized intelligence -> able to learn quickly with little data
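    The two stages above can be sketched with a toy experiment (all names and numbers here are hypothetical, not anything from the thread): a slow, data-hungry stage compresses many noisy tasks into a shared abstraction, after which a brand-new task can be learned from a single sample.

    ```python
    # Toy sketch: tasks share a hidden slope; stage 1 burns lots of data
    # to recover it, stage 2 then adapts to a new task from one sample.
    import random

    random.seed(0)
    SHARED_SLOPE = 3.0  # hidden structure common to every task

    def make_task():
        """A task is y = SHARED_SLOPE * x + b with a task-specific intercept b."""
        b = random.uniform(-5, 5)
        return (lambda x: SHARED_SLOPE * x + b + random.gauss(0, 0.2)), b

    # Stage 1 ("evolution"/pretraining): estimate the shared slope from many
    # tasks, two noisy points each, averaging away per-task differences.
    def pretrain(n_tasks=10_000):
        estimates = []
        for _ in range(n_tasks):
            f, _ = make_task()
            x1, x2 = random.uniform(-1.0, 0.0), random.uniform(1.0, 2.0)
            estimates.append((f(x2) - f(x1)) / (x2 - x1))
        return sum(estimates) / len(estimates)

    # Stage 2 (fast learning): with the abstraction in hand, a single
    # sample pins down a new task's intercept.
    def adapt(f, slope, x=0.7):
        return f(x) - slope * x  # recovered intercept

    slope = pretrain()
    f, true_b = make_task()
    print(round(slope, 1), abs(adapt(f, slope) - true_b) < 1.0)
    ```

    The point of the sketch is the asymmetry: stage 1 needs tens of thousands of samples, while stage 2 needs one, because it inherits the compressed structure rather than rediscovering it.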

    • "Are you talking past me?"

      on this site, more than likely, and with intent