Comment by davidguetta
2 months ago
I can do it if I write the word once and look at it, which is exactly what a transformer-based LLM is supposed to do.
It sees tokens not letters like us. And has to recite tokens in reverse order, and their letters in reverse order, over a set of 200K tokens. Token codes are arbitrary numbers associated with word fragments, they convey no letters.
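To illustrate the point about token codes being arbitrary numbers: here is a toy sketch (not a real tokenizer, and the fragment ids are made up) showing how a word becomes fragment ids that carry no letter information by themselves.

```python
# Toy sketch (NOT a real tokenizer): token ids are arbitrary integers
# tied to word fragments; the id itself encodes no letters.
vocab = {"straw": 4812, "berry": 997, "un": 52, "believ": 7703, "able": 481}

def encode(word, vocab):
    """Greedily split a word into fragments by longest prefix, return their ids."""
    ids = []
    while word:
        for frag in sorted(vocab, key=len, reverse=True):
            if word.startswith(frag):
                ids.append(vocab[frag])
                word = word[len(frag):]
                break
        else:
            raise ValueError("fragment not in toy vocab")
    return ids

print(encode("strawberry", vocab))  # [4812, 997] -- no letters visible in the ids
```

Reversing the letters of "strawberry" from `[4812, 997]` requires the model to have learned the spelling of each fragment separately, which is the crux of the argument above.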
This is true but not as impactful as you think. Ask GPT to rewrite ANY text with spaces between letters and it will do it. It DOES know how to spell and read letters.