
Comment by onlyrealcuzzo

1 year ago

> Like, that has nothing to do with their intelligence.

Because they don't have intelligence.

If they did, they could count the letters in strawberry.

People have been over this. If you believe this, you don't understand how LLMs work.

They fundamentally perceive the world in terms of tokens, not "letters": a word like "strawberry" typically reaches the model as a handful of sub-word token IDs, so the individual characters are never directly part of its input.
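
For the curious, here's a rough sketch of what the model actually sees, using OpenAI's tiktoken library (the encoding name and the exact token splits are illustrative and vary by model):

    import tiktoken

    # Load a BPE encoding; "cl100k_base" is one of the encodings tiktoken ships with.
    enc = tiktoken.get_encoding("cl100k_base")

    token_ids = enc.encode("strawberry")
    pieces = [enc.decode([t]) for t in token_ids]

    # The model receives integer IDs for multi-character chunks,
    # not a sequence of individual letters.
    print(token_ids)   # a short list of integers
    print(pieces)      # sub-word chunks such as "straw" + "berry" (exact split varies)

    # Counting letters is trivial once you operate on characters directly:
    print("strawberry".count("r"))   # 3

The tokenizer hands the model chunk IDs, so "how many r's" is a question about a representation it never operates on directly.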

  • > If you believe this, you don't understand how LLMs work.

    Nor do they understand how intelligence works.

    Humans don't read text a letter at a time. We're capable of deconstructing words into individual letters, but, based on the evidence, that's essentially a separate "algorithm".

    Multi-model systems could certainly be designed to do that, but, just as in the human brain, it's unlikely to ever make sense for a text comprehension and generation model to operate at the level of individual letters.