Comment by pseudocomposer
1 day ago
I mostly just use LLMs for autocomplete (not chat), but wouldn’t this be fixed by adding a “delete message” button/context-menu option in LLM chat UIs?
If you delete the last message from the LLM (so that your message is now the last one), it would then generate a new response. (This would be particularly useful with high-temperature/more “randomly” configured LLMs.)

If you delete any other message, it simply updates the context the LLM sees for any future responses (the real problem at hand: context cleanup).
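Mechanically this should be cheap to build, since most chat APIs are stateless: the client owns the message list and resends it every turn, so “delete” is just list surgery plus an optional regenerate. A minimal sketch in Python, where generate(messages) is a hypothetical stand-in for whatever model call the UI already makes:

    # Chat history the client owns; each entry is {"role": ..., "content": ...}.
    history = [
        {"role": "user", "content": "Explain transformers."},
        {"role": "assistant", "content": "A transformer is ..."},
    ]

    def delete_message(history, index, generate):
        """Remove one message. If the deleted message was a trailing
        assistant reply, regenerate, so the user's last message gets a
        fresh (possibly different, at high temperature) response."""
        del history[index]
        if history and history[-1]["role"] == "user":
            # "Regenerate" just means re-running the trimmed context.
            history.append({"role": "assistant", "content": generate(history)})
        # Otherwise nothing more to do: future turns simply see the
        # cleaned-up context.
        return history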
I think seeing it work this way would also really help end users who think LLMs are “intelligent” understand that an LLM is just a big, complex autocomplete (and that’s still very useful).
Maybe this is standard already, or used in some LLM UI? If not, consider this comment as putting it in the public domain.
Now that I’m thinking about it, it seems practical to use “sub-contextual LLMs” to manage the context of your main LLM chat. Basically, if an LLM response in your chat/context is very long, you could ask the sub-contextual LLM to shorten/summarize that response, trimming down the context for the overall conversation. (More simply, an “edit message” button could do the same, just with you, the human, editing the context instead of an LLM…)
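A hedged sketch of that idea, in the same vein as above: walk the history and, for any message over some length threshold, make a separate summarizer call and replace the message body in place. Again generate(messages) and the threshold are hypothetical, not any particular product’s API:

    MAX_CHARS = 2000  # arbitrary cutoff for "very long"

    def compact_history(history, generate, max_chars=MAX_CHARS):
        """Replace oversized messages with summaries from a separate
        "sub-contextual" model call, shrinking the main conversation's
        context without the user retyping anything."""
        for msg in history:
            if len(msg["content"]) > max_chars:
                summary = generate([
                    {"role": "system", "content": "Summarize this message, "
                        "preserving any facts or code needed later."},
                    {"role": "user", "content": msg["content"]},
                ])
                msg["content"] = "[summarized] " + summary
        return history

The “edit message” button is the degenerate case: the human plays the role of the summarizer and writes msg["content"] directly.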
This is how Claude’s UI used to work, in practice, where you could edit the context directly.