Comment by danenania

1 year ago

> We know definitively that this is not how brains learn.

Ok then, I guess the case is closed.

> an entity needs to be able to form new memories dynamically.

LLMs can form new memories dynamically. Just pop some new data into the context.
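
Concretely, that just means appending to the message list that gets re-sent on every call. A minimal sketch (`call_llm` is a hypothetical stand-in for any chat-completion client; here it just echoes so the example runs):

```python
# "New memories" are just messages appended to the context and re-sent
# on every call; the model weights never change.

def call_llm(messages):
    # Hypothetical placeholder for a real chat client (e.g. an SDK call).
    return f"(model reply given {len(messages)} messages of context)"

history = [{"role": "system", "content": "You are a helpful assistant."}]

def chat(user_input):
    history.append({"role": "user", "content": user_input})
    reply = call_llm(history)      # the model "remembers" only what is re-sent
    history.append({"role": "assistant", "content": reply})
    return reply

print(chat("My name is Dana."))    # fact enters the context
print(chat("What's my name?"))     # recalled from context, not from weights
```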

> LLMs can form new memories dynamically. Just pop some new data into the context.

No, that's an illusion.

The LLM itself is static. The context acts as a sort of temporary memory that doesn't affect the learned behavior of the network at all.

I don't get why people who don't understand what's happening keep arguing that these systems are the sci-fi version of AI. They're not. At least not yet.

  • It isn't temporary if you keep it permanently in context (or in a RAG store) and pass it into every model call, which is how long-term memory is being implemented both in research and in practice (sketched below). And yes, it obviously does affect the learned behavior. The distinction you're making between training and context is arbitrary.
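
A minimal sketch of that long-term-memory pattern, with a naive keyword overlap standing in for a real vector search and `call_llm` again a hypothetical placeholder:

```python
# Memories live in a persistent store; the most relevant ones are retrieved
# and prepended to the context on every call. In practice the store would be
# persisted to disk or a database and searched by embedding similarity.

memory_store = []

def remember(fact):
    memory_store.append(fact)

def retrieve(query, k=3):
    # Naive keyword-overlap ranking; a stand-in for vector search.
    words = set(query.lower().split())
    return sorted(memory_store,
                  key=lambda m: len(words & set(m.lower().split())),
                  reverse=True)[:k]

def call_llm(messages):
    # Hypothetical placeholder for a real chat client.
    return f"(model reply given {len(messages)} messages)"

def chat(user_input):
    facts = "\n".join(retrieve(user_input))
    messages = [
        {"role": "system", "content": f"Known facts about the user:\n{facts}"},
        {"role": "user", "content": user_input},
    ]
    return call_llm(messages)   # behavior changes without any weight update

remember("The user's name is Dana.")
remember("The user prefers terse answers.")
print(chat("What is my name?"))
```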