
Comment by Der_Einzige

1 year ago

Why did the author have to claim that it's not tokenization issues?

This issue, or at least similar ones, is absolutely a tokenization issue.

Karpathy is right that nearly every modern problem with LLMs traces back to tokenization. If you don't believe him, see this write-up by gwern: https://gwern.net/gpt-3#bpes or this work by yours truly: https://aclanthology.org/2022.cai-1.2/
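
To make the point concrete, here's a minimal sketch (assuming the tiktoken library and its "cl100k_base" encoding; the word choice is just for illustration) showing how a BPE tokenizer carves text into opaque subword IDs. The model only ever sees those IDs, never the letters inside them, which is exactly the kind of failure mode gwern's BPE write-up describes:

    # Minimal sketch, assuming `tiktoken` is installed and using the
    # "cl100k_base" BPE encoding. Shows that the same surface word maps to
    # different token sequences depending on casing and leading whitespace,
    # and that no piece exposes individual letters.
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")

    for word in ["strawberry", " strawberry", "Strawberry"]:
        ids = enc.encode(word)
        pieces = [enc.decode([i]) for i in ids]
        print(f"{word!r:15} -> ids={ids} pieces={pieces}")
    # A model trained on these IDs has to memorize spellings per token,
    # so character-level tasks (counting letters, rhyming, arithmetic on
    # digit chunks) suffer for reasons that have nothing to do with
    # "reasoning" and everything to do with the tokenizer.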