Comment by georgeburdell
10 days ago
Is there math backing up the “quadratic” claim about LLM input size? At least in the traffic analogy, I'd imagine it's exponential, but for amounts only slightly exceeding some critical threshold, a quadratic term is a sufficient approximation.
Every token has to calculate attention for every previous token, which means attention takes O(sum_{i=0}^{n-1} i) work; sum_{i=0}^{n-1} i = n(n-1)/2, so that first expression is equivalent to O(n^2).
I'm not sure where you're getting an exponential from.
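As a quick sanity check on that arithmetic (just a sketch I wrote, not anything from the article), here's a naive count of the query-key score computations a causal-attention loop performs, compared against the closed form:

    def pairwise_score_count(n: int) -> int:
        # Token i (0-indexed) computes a score against each of its i previous tokens.
        return sum(i for i in range(n))

    for n in (10, 100, 1000):
        naive = pairwise_score_count(n)
        assert naive == n * (n - 1) // 2   # closed form from the comment above
        print(f"n={n:5d}: {naive} score computations, roughly n^2/2 = {n * n // 2}")

(In practice each token also attends to itself, which adds n more scores and doesn't change the O(n^2) asymptotics.)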