Comment by tmountain

2 months ago

I often ask the LLM for a concise summary of the discussion so far—formatted as a prompt. I then edit it appropriately and use it to start a new conversation without the baggage. I have found this to be a very effective technique, but I imagine it will be automated sometime soon.
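A minimal sketch of that manual workflow, assuming the OpenAI Python client as a stand-in for whatever LLM API you use; the model name, prompt wording, and helper name are illustrative, not part of the original comment:

```python
# Sketch: fold a running conversation into a single prompt, then hand-edit it
# and use it to seed a fresh conversation. Model name and instruction text are
# assumptions for illustration.
from openai import OpenAI

client = OpenAI()

SUMMARIZE_INSTRUCTION = (
    "Summarize the conversation so far as a concise, self-contained prompt "
    "that a fresh assistant could continue from. Keep key decisions, "
    "constraints, and open questions; drop pleasantries and dead ends."
)

def compact(history: list[dict]) -> str:
    """Ask the model to compress the conversation history into one prompt."""
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed; substitute whatever model you actually use
        messages=history + [{"role": "user", "content": SUMMARIZE_INSTRUCTION}],
    )
    return response.choices[0].message.content

# Usage: review and edit the summary by hand, then start a new conversation
# with it as the opening user message, leaving the old history behind.
# new_history = [{"role": "user", "content": compact(history)}]
```

The hand-editing step is the point: you decide which details survive the compaction rather than trusting the model's summary blindly.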

Cursor tried doing this automatically (it may still, if you're not on a large-context model like Gemini 2.5 Pro), but I found the summary was missing too many details to use out of the box.

Claude Code has a /compact command that summarises the conversation so far to save on context tokens.