Comment by skeledrew
2 days ago
Because anyone can actually check the code, which means if there's any funny business, someone will come across it eventually and blow it open.
There probably wouldn't be anything funny-looking; it might look like a genuine implementation mistake that somehow burns 2× or 3× the tokens (which, considering OpenClaw is vibe coded in the purest sense of the term, would blend right in).
Regardless, such things would eventually be found. Just as OpenClaw was tasked with finding and improving science repos (however unwelcome that was), it could, and very likely will, be tasked with improving its own codebase.
The bug that was causing the crazy token burn was added on Feb 15. It was claimed to have been fixed on Feb 19 (see https://github.com/openclaw/openclaw/pull/20597), but it's unclear to me whether that fix has been rolled out yet, or whether it completely solved the problem (see https://github.com/openclaw/openclaw/issues/21785).
TLDR: the commit broke prompt caching, so the entire conversation history was billed as fresh input on every call instead of most of it being served from the cache.
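To see why broken caching multiplies token spend, here is a minimal back-of-the-envelope sketch. All numbers are illustrative assumptions (a ~90% discount on cached prefix tokens and ~500 new tokens per turn), not OpenClaw's or any provider's actual pricing or implementation:

```python
def total_input_tokens(turn_sizes, cached):
    """Total billable input tokens across a multi-turn conversation.

    turn_sizes: tokens of new context added at each turn.
    cached: if True, the previously seen prefix is assumed to bill at
    ~10% of a fresh token (a typical cache discount, assumed here);
    if False, the whole history is re-billed as new input each call,
    which is what the caching bug effectively did.
    """
    total = 0.0
    seen = 0
    for new in turn_sizes:
        if cached:
            total += seen * 0.1 + new   # discounted prefix + new tokens
        else:
            total += seen + new         # entire history billed as new
        seen += new
    return total

turns = [500] * 20  # 20 turns, ~500 new tokens each (made-up numbers)
broken = total_input_tokens(turns, cached=False)
fixed = total_input_tokens(turns, cached=True)
print(f"uncached: {broken:.0f}, cached: {fixed:.0f}, "
      f"ratio: {broken / fixed:.1f}x")
# Even a modest 20-turn session comes out several times more expensive
# uncached, and the gap widens quadratically as the conversation grows.
```

Note the multiplier isn't fixed at 2× or 3×: since the uncached cost grows with the square of the conversation length, long sessions blow up far faster than short ones, which fits the "crazy token burn" reports.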