Comment by 1024core

1 day ago

The current token depends on _all_ previous tokens, but only indirectly for the ones before the previous one, no?

No, using a large context window (which includes previous tokens) is a critical ingredient in the success of modern LLMs, which is why you will often see the window size mentioned in discussions of newly released LLMs.

In the autoregressive formulation the previous token is no different from any other past token, so no. Historically, some models took the shortcut of only directly conditioning on the previous token, or used some other recursive formulation that reused intermediate states from generating the previous token, but that's not part of the theoretical formulation of an autoregressive model, and plenty of past autoregressive models didn't do that, for example nonlinear autoregressive models.
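The distinction can be sketched with a toy example: under the general autoregressive factorization p(x_t | x_1, ..., x_{t-1}) the next-token distribution may depend directly on every prefix token, whereas a Markov-style shortcut only looks at the last one. The function names and toy scoring rules below are hypothetical illustrations, not any real model's API:

```python
def markov_next_logits(prefix, vocab):
    # Shortcut: the score depends ONLY on the immediately preceding token.
    last = prefix[-1]
    return {tok: 1.0 if tok == last else 0.0 for tok in vocab}

def full_context_next_logits(prefix, vocab):
    # General autoregressive model: the score may depend on EVERY prefix
    # token. Toy rule: count each candidate's occurrences in the prefix.
    return {tok: float(prefix.count(tok)) for tok in vocab}

vocab = ["a", "b", "c"]
prefix = ["a", "b", "a", "c"]

# The Markov shortcut only "sees" the final token "c"...
print(markov_next_logits(prefix, vocab))        # {'a': 0.0, 'b': 0.0, 'c': 1.0}
# ...while the full-context model reflects the entire history.
print(full_context_next_logits(prefix, vocab))  # {'a': 2.0, 'b': 1.0, 'c': 1.0}
```

Both are valid autoregressive models in the formal sense; the point is that nothing in the formulation restricts the dependence to the previous token, and transformer LLMs attend over the whole window directly.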