Comment by eli
5 days ago
There are already various proxies to translate between OpenAI-style models (local or otherwise) and an Anthropic endpoint that Claude Code can talk to. Is the advantage here just one less piece of infrastructure to worry about?
Sidetracking here - but has anyone got one that _actually_ works?
In particular, I'd like to call Claude models - hosted in OpenAI schema by a reseller - through some proxy that offers the Anthropic format to my `claude` CLI, but it seems like nothing fully lines things up (double-translated tool names, for example).
The reseller is abacus.ai. I've tried BerriAI/litellm, musistudio/claude-code-router, ziozzang/claude2openai-proxy, 1rgs/claude-code-proxy, and fuergaosi233/claude-code-proxy.
What probably needs to exist is something like `llsed`.
The invocation would be something like this (flags are a sketch, not a final CLI):
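```sh
# hypothetical flags - sketching the shape, not llsed's actual CLI
llsed --listen 127.0.0.1:8080 \
      --upstream https://api.abacus.ai/v1 \
      --rules rules.json
```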
Where the JSON has something like this (again, an invented shape, not a real schema):
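```jsonc
// invented rule shape, for illustration only
{
  "pre": [
    // strip the prefix the OpenAI-side translation already added,
    // so tool names don't get translated twice on the way in
    { "path": "$.tools[*].name", "sub": ["^mcp__", ""] }
  ],
  "post": [
    // put the prefix back on the way out so the client's tool ids still match
    { "path": "$.choices[*].message.tool_calls[*].function.name", "sub": ["^", "mcp__"] }
  ]
}
```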
So if one call needs to become two, you can invoke multiple transforms in the pre or post stage, or rearrange things accordingly.
This sounds like the proper separation of concerns here... probably
The pre/post hooks should probably be JSON-RPC handlers that get lazy-loaded.
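For instance, the proxy's exchange with a pre hook might look like this (method and field names are guesses at a shape, nothing committed):

```jsonc
// hypothetical wire format, not a committed protocol
// proxy -> pre hook
{"jsonrpc": "2.0", "id": 1, "method": "pre",
 "params": {"request": {"model": "claude-sonnet-4-20250514",
                        "messages": [{"role": "user", "content": "hi"}]}}}

// pre hook -> proxy: the rewritten request the proxy sends upstream
{"jsonrpc": "2.0", "id": 1,
 "result": {"request": {"model": "gpt-4o",
                        "messages": [{"role": "user", "content": "hi"}]}}}
```

Lazy loading would then just mean the proxy doesn't spawn a hook process until the first rule that needs it fires.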
Writing that now. Let's do this: https://github.com/day50-dev/llsed
Some unsolicited advice: Streaming support is tricky. I'd strip the streaming out when you proxy until everything else is solid.
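A minimal sketch of that idea, assuming an OpenAI-style upstream (the URL and function name here are placeholders, not anyone's actual API):

```python
import json
import urllib.request

# Placeholder upstream - assuming an OpenAI-style chat completions endpoint.
UPSTREAM = "https://api.openai.com/v1/chat/completions"

def forward_unstreamed(client_body: dict, api_key: str) -> dict:
    """Forward a chat request upstream with streaming forced off.

    Even if the client asked for SSE, the proxy gets back one complete
    JSON response to translate, instead of rewriting half-parsed chunks live.
    """
    body = dict(client_body)
    body["stream"] = False  # strip streaming until the translation is solid
    req = urllib.request.Request(
        UPSTREAM,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

If the client insists on SSE, the proxy can still fake a one-chunk stream from the complete response, which is much easier to get right than translating chunk deltas on the fly.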
I've been hacking on this one for a few months now and it works for me: https://github.com/elidickinson/claude-code-mux. I've been optimizing it for routing to different models within one session, so it may be overkill.
But I'm surprised litellm (and its wrappers) don't work for you, and I wonder if there's something wrong with your provider or model. Which model were you using?