Comment by doug_durham
4 days ago
Do you have a 200k context window? I don't. Most humans can only keep 6 or 7 things in short-term memory. Beyond those 6 or 7, you are either pulling data from your latent space or replacing one of the short-term slots with new content.
But context windows for LLMs include all the "long term memory" things you're excluding from humans.
Long term memory in an LLM is its weights.
Not really, because humans can form long-term memories from conversations, but LLM users aren't fine-tuning models after every chat so that the model remembers them.