Comment by energy123

18 days ago

Modern LLMs are large token models. I believe you can model the world at sufficient granularity with token sequences. You can pack a lot of information into a sequence of 1 million tokens.

Let's be accurate. LLMs are large text-corpus models. The texts are encoded as tokens, but that's just an implementation detail.
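The "implementation detail" point can be made concrete with a minimal sketch: tokenization is just a reversible mapping from text to integer IDs, so the token layer adds no information beyond the text itself. This is a hypothetical byte-level encoder for illustration, not any real model's tokenizer.

```python
def encode(text: str) -> list[int]:
    """Map text to token IDs; here each UTF-8 byte is one token."""
    return list(text.encode("utf-8"))

def decode(tokens: list[int]) -> str:
    """Invert encode: map token IDs back to the original text."""
    return bytes(tokens).decode("utf-8")

# The round trip is lossless: the tokens carry exactly the text, no more.
ids = encode("LLMs model text; tokens are the encoding.")
assert decode(ids) == "LLMs model text; tokens are the encoding."
```

Real tokenizers (BPE and friends) merge frequent byte sequences into single IDs to shorten sequences, but the same invertibility holds, which is why the token choice is an encoding concern rather than a modeling one.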