Comment by meatmanek

4 hours ago

For the truncated session logs issue, it sounds like the context is being auto-compacted (or simply truncated), since LLMs can only handle a finite amount of context.

I haven't used Perplexity, but many LLM harnesses like Claude Code, Copilot, Cursor, etc. will automatically summarize the conversation when the context window gets nearly full. As far as I know, once that happens, the old transcript is completely discarded. (I could be wrong though.) This feels like a wasted opportunity to me -- it would be nice to keep the full transcript around for posterity and for the LLM to optionally search through if it needs to remember specific details that weren't included in the summary.

I haven't tried it, but I think you could keep the full transcript by running a PreCompact hook (in Claude Code) that saves your entire conversation history to a file.
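As a rough sketch of what that hook script might look like (assuming Claude Code passes the hook input as JSON on stdin with a `transcript_path` field pointing at the session transcript -- check the hooks documentation for your version, as the payload shape may differ):

```python
"""Sketch of a PreCompact hook: back up the full transcript before compaction.

Assumptions (verify against Claude Code's hooks docs):
  - hook input arrives as JSON on stdin
  - it includes a "transcript_path" field naming the session transcript file
Register the script in settings.json under hooks -> PreCompact.
"""
import json
import shutil
import sys
import time
from pathlib import Path


def backup_transcript(payload: dict, backup_dir: Path) -> Path:
    """Copy the transcript named in the hook payload into backup_dir."""
    transcript = Path(payload["transcript_path"])
    backup_dir.mkdir(parents=True, exist_ok=True)
    # Timestamp the copy so repeated compactions don't overwrite each other.
    dest = backup_dir / f"{transcript.stem}-{int(time.time())}{transcript.suffix}"
    shutil.copy2(transcript, dest)
    return dest


# When invoked as a hook, you would call something like:
#   backup_transcript(json.load(sys.stdin), Path.home() / "claude-transcripts")
```

The idea is just that the pre-compaction transcript survives on disk, so the summary step can discard whatever it likes from the live context.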

I'm able to copy and paste entire sessions in Grok, GPT, Claude, and Gemini. Just not in Perplexity. Again, as I've said elsewhere, try it. I've documented it on video, beyond all refutation. It is what it is, and I'm not in control.

I do appreciate the feedback though.