Comment by PittleyDunkin
1 year ago
A byte is itself sort of a token. So is a bit. It makes more sense to use multiple tokenizers in parallel than to try to invent an entirely new way of seeing the world.
Anyway, humans have to tokenize, too. We don't perceive the world as a continuous blob either.
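A minimal sketch of the "a byte is a token" point (Python, purely illustrative): any string already reduces to a fixed vocabulary of 256 byte values, which is what byte-level models treat as their token alphabet.

```python
# Byte-level "tokenization": the vocabulary is just the 256 possible byte values.
text = "humans have to tokenize, too"
byte_tokens = list(text.encode("utf-8"))    # each token is an integer in 0..255
print(byte_tokens[:8])                      # [104, 117, 109, 97, 110, 115, 32, 104]

# The mapping is lossless: the original text is recoverable from the tokens.
assert bytes(byte_tokens).decode("utf-8") == text
```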
I would say that "humans have to tokenize" is almost precisely the opposite of how human intelligence works.
We build layered, non-nested gestalts out of real-time analog inputs. As a small example: the meaning of a sentence spoken with exactly the same rhythm and intonation can be meaningfully changed by a gesture made while saying it. That can't be tokenized, and that isn't what's happening.
What is a gestalt if not a token (or a token representing a collection of other tokens)? It seems more reasonable (to me) to conclude that we have multiple, contradictory tokenizers that we select among, rather than to reject the concept entirely.
> That can't be tokenized
Oh ye of little imagination.