Comment by causal
1 year ago
Hinton is way off IMO. The number of examples needed to teach language to an LLM is many orders of magnitude greater than what humans require. Not to mention power consumption and inelasticity.
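Rough numbers behind that claim (my own ballpark figures, not anything from Hinton): a human hears or reads maybe ~10^8 words by adulthood, while recent large LLMs train on something like ~10^13 tokens. A quick sketch:

    # Back-of-envelope sketch; both constants are rough assumptions,
    # not measurements from any particular model or study
    import math

    human_words = 1e8    # ~100M words of lifetime linguistic exposure (common estimate)
    llm_tokens = 1e13    # ~10T training tokens, typical of recent large runs

    ratio = llm_tokens / human_words
    print(f"LLM uses ~{ratio:.0e}x the data (~{math.log10(ratio):.0f} orders of magnitude)")

Under those assumptions the gap comes out to roughly five orders of magnitude, which is consistent with "many".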
I think what Hinton is saying is that, in his opinion, if you fed 1/100th of a human cortex the amount of data used to train LLMs, you wouldn't get a thing that can speak in 80 different languages about a gigantic number of subjects, but (I'm interpreting here..) about ten grams of fried, fuming organic matter.
This doesn't mean that an entire human brain doesn't surpass LLMs in many different ways, only that artificial neural networks appear to be able to absorb and process more information per neuron than we do.