Comment by irthomasthomas
4 days ago
Yeah, I don't know exactly what an AGI model will look like, but I think it would have more than a 200k context window.
Do you have a 200k context window? I don't. Most humans can only keep 6 or 7 things in short-term memory. Beyond those 6 or 7, you are pulling data from your latent space, or replacing one of the short-term slots with new content.
But context windows for LLMs include all the "long-term memory" things you're excluding from humans.
Long-term memory in an LLM is its weights.
I'm not quite AGI, but I work quite adequately with a much, much smaller memory. Maybe AGI just needs to know how to use other computers and work with storage a bit better.
I'd think it would at least be able to suggest which model to use, rather than just giving you 6 to choose from.