Comment by int_19h
5 days ago
If you treat the human brain as a model, and account for the full complexity of neurons (one neuron != one parameter!) it has several orders of magnitude more parameters than any LLM we've made to date, so it shouldn't come as a surprise.
What is surprising is that our brain, as complex as it is, can train so fast on such a meager energy budget.
You're right, but at the same time the human brain does a lot more (muscle coordination, smell, touch sensing), and all of that takes up at least some of the budget.
So it's an interesting question, but I'm not convinced it's only a scale issue. Finished models don't really learn the way humans do - we change our parameters "at runtime", effectively updating the model itself, so learning isn't limited to the current context.
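That distinction can be sketched with a toy model (everything here is hypothetical and just for illustration): a deployed LLM's weights are frozen at inference time, while an "online" learner nudges its parameters on every example it sees.

```python
# Toy contrast: a "frozen" model vs. one that keeps learning "at runtime"
# via online gradient updates. All names are illustrative, not from any
# real framework.

class TinyModel:
    def __init__(self, w=0.0):
        self.w = w  # a single "parameter"

    def predict(self, x):
        return self.w * x

    def online_update(self, x, y, lr=0.1):
        # One SGD step on squared error: w -= lr * d/dw (w*x - y)^2
        grad = 2 * (self.w * x - y) * x
        self.w -= lr * grad

# Frozen model: sees examples "in context" but its parameter never changes.
frozen = TinyModel(w=0.5)
for x, y in [(1.0, 2.0), (2.0, 4.0)]:
    frozen.predict(x)      # inference only
assert frozen.w == 0.5     # weights untouched, like a deployed LLM

# Online learner: each example nudges the parameter toward the rule y = 2x.
learner = TinyModel(w=0.5)
for _ in range(50):
    for x, y in [(1.0, 2.0), (2.0, 4.0)]:
        learner.online_update(x, y)
print(round(learner.w, 2))  # converges near 2.0
```

In-context "learning" is the first loop: the model adapts its outputs without touching its weights, and everything is gone when the context ends. Human-style learning is closer to the second loop.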