Comment by danenania

1 year ago

As the comment I replied to correctly said, we don’t know how the brain produces cognition. So you certainly cannot discard the hypothesis that it works by “parroting” a weighted average of training data, just as LLMs are alleged to do.

Considering that LLMs with far fewer neurons than the brain are in many cases producing human-level output, there is some evidence, if circumstantial, that our brains may be doing something similar.

LLMs don't have neurons. That's just marketing lol.

"A neuron in a neural network typically evaluates a sequence of tokens in one go, considering them as a whole input." -- ChatGPT

You could consider an RTX 4090 to be one neuron too.

  • It’s almost as if ‘neuron’ has a different meaning in computer science than biology.

    • LOL you just owned the guy who said "LLMs with a much smaller number of neurons than the brain are in many cases producing human-level output"
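For reference on the terminology dispute above: in machine learning, a “neuron” is not something that “evaluates a sequence of tokens in one go” as the quoted ChatGPT answer claims — it is a single unit computing a weighted sum of its inputs passed through a nonlinearity. A minimal sketch (weights and inputs are made-up illustrative values, not from any real model):

```python
def relu(x):
    """Rectified linear unit, a common activation nonlinearity."""
    return max(0.0, x)

def neuron(inputs, weights, bias):
    # An artificial "neuron": weighted sum of inputs plus a bias,
    # passed through an activation function.
    return relu(sum(w * x for w, x in zip(weights, inputs)) + bias)

# Hypothetical example: three inputs, three weights.
out = neuron([1.0, -2.0, 0.5], [0.4, 0.1, -0.3], bias=0.05)
# 0.4*1.0 + 0.1*(-2.0) + (-0.3)*0.5 + 0.05 ≈ 0.1, and relu leaves it positive.
print(out)
```

A transformer LLM stacks billions of these units; each individual unit sees only the activations feeding into it, not whole token sequences.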

> in many cases producing human-level output

They’re not, unless you blindly believe OpenAI press releases and crypto-scammer AI hype bros on Twitter.