Comment by SomeCallMeTim

1 year ago

No.

We know how current deep learning neural networks are trained.

We know definitively that this is not how brains learn.

Understanding requires learning. Dynamic learning. In order to experience something, an entity needs to be able to form new memories dynamically.

This does not happen anywhere in current tech. It's faked in some cases, but no, it doesn't really happen.

> We know definitively that this is not how brains learn.

Ok then, I guess the case is closed.

> an entity needs to be able to form new memories dynamically.

LLMs can form new memories dynamically. Just pop some new data into the context.
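
For what it's worth, a minimal sketch of what that looks like in code. The generate() function here is a made-up placeholder for whatever frozen model you call; only the prompt changes between calls:

```python
# Sketch only: "new memories" via the context window, with frozen weights.
# generate() is a placeholder for any call to a frozen LLM (API or local model).

def generate(prompt: str) -> str:
    """Stand-in for an LLM call; returns a canned string so the sketch runs."""
    return f"(frozen-model output conditioned on {len(prompt)} chars of context)"

new_fact = "The user's dog is named Pixel and was adopted last Tuesday."

prompt = (
    "Use the notes below when answering.\n"
    f"Notes: {new_fact}\n\n"
    "Question: What is my dog's name?\n"
    "Answer:"
)

# No weight ever changes, but the reply can now use the newly supplied fact.
print(generate(prompt))
```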

  • > LLMs can form new memories dynamically. Just pop some new data into the context.

    No, that's an illusion.

    The LLM itself is static. The context window acts as a sort of temporary memory that doesn't affect the learned behavior of the network at all.

    I don't get why people who don't understand what's happening keep arguing that today's AI is some sci-fi version of AI. It isn't. At least not yet.

    • It isn't temporary if you keep it permanently in context (or in a RAG store) and pass it into every model call, which is how long-term memory is being implemented both in research and in practice. And yes, it obviously does affect the learned behavior. The distinction you're making between training and context is arbitrary.
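
      Roughly, the pattern is the sketch below. Everything in it is illustrative: retrieve() is a toy word-overlap ranker standing in for embedding search, and generate() is a placeholder for the frozen model call:

      ```python
      # Sketch only: a persistent memory store consulted on every model call.
      # generate() is a placeholder; retrieve() is a toy stand-in for embedding search.

      memory_store = [
          "User prefers answers in metric units.",
          "User's dog is named Pixel.",
          "User is learning Rust.",
      ]

      def retrieve(query: str, store: list[str], k: int = 2) -> list[str]:
          """Rank stored memories by naive word overlap with the query."""
          words = set(query.lower().split())
          return sorted(store, key=lambda m: -len(words & set(m.lower().split())))[:k]

      def generate(prompt: str) -> str:
          return f"(frozen-model output for {len(prompt)} chars of context)"

      def answer(question: str) -> str:
          memories = retrieve(question, memory_store)
          prompt = "Known facts:\n" + "\n".join(f"- {m}" for m in memories)
          prompt += f"\n\nQuestion: {question}\nAnswer:"
          # Every call sees the stored memories, so nothing about them is temporary.
          return generate(prompt)

      print(answer("What is my dog's name?"))
      ```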

> We know definitively that this is not how brains learn.

So you have a mechanistic, formal model of how the brain functions? That's news to me.

  • Your brain was first trained by reading all of the Internet?

    Anyway, the question of whether computers can think is as interesting as the question of whether submarines can swim.

    • > Anyway, the question of whether computers can think is as interesting as the question of whether submarines can swim.

      Given the amount of ink spilled on the question, gotta disagree with you there.

  • There's no way brains have the "right answers" fed into them as required by backpropagation.

    • Look up predictive coding. Our senses are constantly feeding us corrections to our predictions, and that prediction error plays the role of the "right answer".
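
      To make that concrete, here's a toy sketch (nothing brain-specific about it, and the linear form and numbers are arbitrary): an online predictor whose only teaching signal is the error between its own prediction and the next observation, with no external answer key:

      ```python
      # Toy illustration, not a brain model: the "label" is just the next sensory
      # observation, so the error signal comes from the world, not from a teacher.
      import random

      w, b = 0.0, 0.0   # internal model: predict the sensation a stimulus will produce
      lr = 0.05         # learning rate

      for _ in range(20_000):
          stimulus = random.uniform(-1, 1)
          prediction = w * stimulus + b                          # top-down prediction
          sensed = 0.8 * stimulus + 1.0 + random.gauss(0, 0.05)  # what actually arrives
          error = prediction - sensed                            # prediction error
          w -= lr * error * stimulus   # adjust the model to reduce future surprise
          b -= lr * error

      print(f"learned w={w:.2f}, b={b:.2f}  (hidden relation was 0.8, 1.0)")
      ```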