Comment by lwhi

14 days ago

What do you think context is, if not 'attention'?

You can create a context that includes info and instructions, but the agent may not pay attention to everything in the context, even if context usage is low.

IMO "Attention" is an abstraction over the result of prompt engineering, the chain reaction of input converging the output (both "thinking" and response).

Context is the information you give the model; attention is which parts of it the model actually focuses on.

And attention is finite in capacity and emergent from the model's architecture.
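
To make the "finite capacity" point concrete, here's a minimal NumPy sketch of scaled dot-product attention. It's not any particular model's implementation (the token count and dimensions are made up), but it shows the mechanism: the softmax weights over the context sum to 1, so focus is a fixed budget, and putting more of it on one part of the context necessarily takes it away from another.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_queries, d), K/V: (n_context, d). Returns output and attention weights."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # similarity of each query to each context token
    # softmax over the context: each row of weights sums to 1 (the finite "budget")
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Illustrative shapes only, not taken from any real model.
rng = np.random.default_rng(0)
n_context, d = 8, 16                 # 8 context tokens, 16-dim embeddings
Q = rng.normal(size=(1, d))          # one query, e.g. the next token being generated
K = rng.normal(size=(n_context, d))
V = rng.normal(size=(n_context, d))

out, weights = scaled_dot_product_attention(Q, K, V)
print(weights.round(3))  # some context positions get most of the weight, others almost none
```

So even when "context usage" is low, nothing guarantees the weights land on the instructions you care about; where they land is a product of the trained architecture, not something you directly control by adding more text.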