Comment by adjwilley

1 day ago

The bug that was causing the crazy token burn was added on Feb 15. A fix was claimed on Feb 19 (see https://github.com/openclaw/openclaw/pull/20597), but it's unclear to me whether that fix has actually rolled out yet or whether it completely solved the problem (see https://github.com/openclaw/openclaw/issues/21785).

TLDR: the commit broke prompt caching, so the entire conversation history was treated as fresh input tokens on every call instead of most of the conversation being served from the cache.
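
To make the caching point concrete, here's a rough sketch of the difference. It assumes an Anthropic-style Messages API where you mark a cache breakpoint with `cache_control` on a content block; OpenClaw's actual request-building code isn't shown in the PR excerpt above, so the payload shape, field names, and both function names here are just illustrative assumptions, not the real implementation.

```typescript
// Sketch only: payload shape mimics an Anthropic-style Messages API with
// `cache_control` breakpoints. OpenClaw's real request builder may differ.

type ContentBlock = {
  type: "text";
  text: string;
  cache_control?: { type: "ephemeral" };
};

type Message = { role: "user" | "assistant"; content: ContentBlock[] };

// Intended behavior: put a cache breakpoint at the end of the prior
// conversation, so the provider reuses the cached prefix and only the new
// user turn is billed as uncached input tokens.
function buildCachedRequest(history: Message[], newUserTurn: string) {
  const cached = history.map((m) => ({ ...m, content: [...m.content] }));
  const last = cached[cached.length - 1];
  if (last) {
    const i = last.content.length - 1;
    last.content[i] = { ...last.content[i], cache_control: { type: "ephemeral" } };
  }
  return {
    messages: [
      ...cached,
      { role: "user" as const, content: [{ type: "text" as const, text: newUserTurn }] },
    ],
  };
}

// The broken behavior described above: no cache breakpoint at all, so every
// call re-sends the whole history as new input and the full prefix gets
// billed again on each turn.
function buildUncachedRequest(history: Message[], newUserTurn: string) {
  return {
    messages: [
      ...history,
      { role: "user" as const, content: [{ type: "text" as const, text: newUserTurn }] },
    ],
  };
}
```

The cost difference compounds with conversation length: without the breakpoint, turn N re-bills all N-1 previous turns at the full input rate, which is exactly the runaway token burn people were seeing.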