Comment by mirimir

6 years ago

I know ~nothing about AI. But to me, this seems like a great summary. And as a one-time developmental biologist, I'm struck by these observations:

> One thing that should be learned from the bitter lesson is the great power of general purpose methods, of methods that continue to scale with increased computation even as the available computation becomes very great. The two methods that seem to scale arbitrarily in this way are search and learning.

> The second general point to be learned from the bitter lesson is that the actual contents of minds are tremendously, irredeemably complex; we should stop trying to find simple ways to think about the contents of minds, such as simple ways to think about space, objects, multiple agents, or symmetries.

From what I know about brain development, "search and learning" are key mechanisms there too, plus massive overproduction followed by selection, which is basically a form of learning. Maybe that's the main takeaway from biology.

I was thinking the same thing. When I "play" with a new language or tool or concept, I try lots of different scenarios (search) until I can reliably predict how the new thing will behave (learning).

  • That's pretty much how our brains develop. Neurons are vastly overproduced, from fetal development through the first few years of life. The ones that make useful connections, and do useful stuff, survive; the rest die.

    Also, as in evolution, ~random variations occur during neuronal proliferation, so there's also selection on epigenetic differences. The same sort of process occurs in the immune system. (There's a toy sketch of this overproduce-and-select loop below.)

    In this way, organisms can transcend the limitations of their genetic sequences: there's learning at the level of both structure and function.
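To make that concrete, here's a minimal Python sketch of overproduction and selection acting as a learning process. Everything in it is made up for illustration (the linear "units", the target function, the pool sizes); it's not anything from the essay, just the shape of the loop:

```python
import random

random.seed(0)

def target(x):
    """The 'useful stuff' a surviving unit should predict (made up)."""
    return 3.0 * x - 1.0

SAMPLES = [random.uniform(-1.0, 1.0) for _ in range(100)]

def error(weight, bias):
    """Mean squared prediction error of one candidate unit."""
    return sum((weight * x + bias - target(x)) ** 2 for x in SAMPLES) / len(SAMPLES)

# Overproduction: generate far more candidate units than will be kept.
candidates = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(10_000)]

# Selection: units that "do useful stuff" (low error) survive; the rest die.
survivors = sorted(candidates, key=lambda wb: error(*wb))[:10]

w, b = survivors[0]
print(f"best surviving unit: weight={w:.2f}, bias={b:.2f}, error={error(w, b):.4f}")
```

The random generation is the "search" and keeping what predicts well is the "learning"; a bigger candidate pool (more computation) generally yields a better survivor, which is exactly the scaling point from the essay.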