Comment by samatman

1 year ago

> I would say that "humans have to tokenize" is almost precisely the opposite of how human intelligence works.
>
> We build layered, non-nested gestalts out of real-time analog inputs. As a small example, the meaning of a sentence said with precisely the same rhythm and intonation can be meaningfully changed by a gesture made while saying it. That can't be tokenized, and tokenizing isn't what's happening.

What is a gestalt if not a token (or a token standing in for a collection of other tokens)? It seems more reasonable (to me) to conclude that we have multiple, mutually contradictory tokenizers that we select among, rather than to reject the concept entirely.

> That can't be tokenized

Oh ye of little imagination.
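To make "it can be tokenized" concrete, here's a toy sketch (every name, channel, and timestamp is made up for illustration) that folds gesture and prosody into the same time-ordered token stream as the words:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Token:
    channel: str  # "word", "prosody", or "gesture"
    value: str    # the discrete symbol emitted for this channel
    t: float      # onset time in seconds

def tokenize_utterance(words, prosody, gestures):
    """Merge per-channel (value, onset_time) pairs into one stream.

    The point: a gesture is just another discrete token, interleaved
    by time, so downstream processing sees it in context with the
    words it modifies.
    """
    stream = (
        [Token("word", v, t) for v, t in words]
        + [Token("prosody", v, t) for v, t in prosody]
        + [Token("gesture", v, t) for v, t in gestures]
    )
    return sorted(stream, key=lambda tok: tok.t)

# Same words, same intonation, different gesture -> a different token
# stream, hence a different "sentence" from the model's point of view.
words = [("nice", 0.0), ("catch", 0.4)]
prosody = [("flat", 0.0)]
sincere = tokenize_utterance(words, prosody, [("thumbs-up", 0.2)])
sarcastic = tokenize_utterance(words, prosody, [("eye-roll", 0.2)])
assert sincere != sarcastic
```

Whether brains do anything like this is a separate question; the sketch only shows that the gesture example is representable as tokens once each channel's output is treated as symbols with timestamps.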