
Comment by doctoboggan

4 days ago

Last time I tried Open WebUI (a few months ago), it was pretty painful to connect non-OpenAI externally hosted models. There was a workaround that involved installing a third-party "function" (or was it a "pipeline"?), but it didn't feel smooth.

Is this easier now? Specifically, I would like to connect Anthropic models just by plugging in my API key.

The trick is to run a LiteLLM proxy configured with connections to whatever providers you need, and then point Open WebUI at the proxy.

I've been using this setup for several months now (over a year?) and it's very effective.

The proxy also benefits pretty much any other application you have that speaks an OpenAI-compatible API. (Or, in some cases, even if it doesn't.)
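A minimal sketch of that setup, assuming LiteLLM's `config.yaml` model-list format and an OpenAI-compatible connection in Open WebUI; the model name and port here are illustrative, not prescribed:

```yaml
# config.yaml — LiteLLM proxy sketch (model name and port are assumptions)
model_list:
  - model_name: claude-sonnet          # the name Open WebUI will see
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY   # read the key from the environment
```

Then start the proxy with something like `litellm --config config.yaml --port 4000`, and in Open WebUI add an OpenAI-compatible API connection pointing at `http://localhost:4000/v1`. The Anthropic key lives only in the proxy; Open WebUI just sees one more OpenAI-style endpoint.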

No, it's still the same. On the other hand, it works perfectly fine for Claude, and that is the only one I use. I just wish they would finally add native support for this ...