Comment by charcircuit

14 hours ago

>and doing that will cause a huge one-time hit against your token limit if the session has grown large.

>Anthropic already profited from generating those tokens. They can afford to subsidize reloading context.

No, they can't; that's what you don't seem to get.

Reloading those tokens takes roughly the same compute as processing them the first time: the model has to re-run prefill over the entire context unless a KV cache for that session is still warm, and caches for old sessions generally aren't.
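A back-of-envelope sketch of why: prefill compute scales linearly with token count, using the common ~2 FLOPs-per-parameter-per-token estimate. The model size below is a hypothetical round number, not Anthropic's actual parameter count.

```python
# Rough estimate: forward-pass compute to ingest context scales with
# token count, so re-ingesting a session's history costs about what it
# cost the first time (absent a warm KV cache).
PARAMS = 100e9                  # hypothetical 100B-parameter model
FLOPS_PER_TOKEN = 2 * PARAMS    # standard ~2 FLOPs per parameter per token

def prefill_flops(n_tokens: int) -> float:
    """Approximate compute to run prefill over n_tokens of context."""
    return n_tokens * FLOPS_PER_TOKEN

original = prefill_flops(100_000)  # processing the session as it grew
reload = prefill_flops(100_000)    # reloading the same 100k-token history
# Same token count, same order of compute: reloading is not free.
```

So "reloading context" is not a cheap lookup; it is another full prefill pass over the same tokens.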

It's ok to be ignorant of how the infrastructure for LLMs works, just don't be proud of it.