Comment by measurablefunc
5 hours ago
I'm correct on the technical level as well: https://chatgpt.com/s/t_698293481e308191838b4131c1b605f1
That math is for comparing all n-grams for all n <= N simultaneously, which isn't what was being discussed.
For any fixed n-gram size, the complexity is still O(N^2), same as standard attention.
I was talking about all n-gram comparisons.
Thanks for clarifying. I was hoping to pin down the disconnect between you two, which looked like it hinged on "bigrams, trigrams, & so on." That phrase reads idiomatically as enumerating fixed-n cases; parsing "& so on" as "their simultaneous union" asks quite a bit of those two words. Either way, as the ChatGPT link you shared shows, comparing all n-grams brings us to O(N^3), still several exponents short of the N^10 that started this thread.
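
For concreteness, here's a minimal sketch of those counts (my own illustration, not from either comment, assuming a "comparison" is one pairwise check between two contiguous n-grams of a length-N sequence):

    def fixed_n_comparisons(N: int, n: int) -> int:
        # A length-N sequence has N - n + 1 contiguous n-grams,
        # so comparing every pair costs (N - n + 1)^2 comparisons.
        return (N - n + 1) ** 2

    def all_n_comparisons(N: int) -> int:
        # Sum the fixed-n count over every n from 1 to N.
        return sum(fixed_n_comparisons(N, n) for n in range(1, N + 1))

    for N in (10, 100, 1000):
        # Fixed n (here n = 2) grows like N^2; the all-n total equals
        # N(N+1)(2N+1)/6, i.e. O(N^3) -- nowhere near N^10.
        print(N, fixed_n_comparisons(N, 2), all_n_comparisons(N))

So even the most generous reading of "& so on" only adds one extra factor of N on top of the O(N^2) of standard attention.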