
Comment by jonahbenton

4 days ago

OpenWebUI is what you are looking for from a usability perspective. It supports multi-model chat.

Last I tried OpenWebUI (a few months ago), it was pretty painful to connect non-OpenAI externally hosted models. There was a workaround that involved installing a 3rd-party "function" (or was it a "pipeline"?), but it didn't feel smooth.

Is this easier now? Specifically, I would like to easily connect anthropic models just by plugging in my API key.

  • The trick to this is to run a LiteLLM proxy that has all the connections to whatever you need to connect to and then point Open-WebUI to that.

    I've been using this setup for several months now (over a year?) and it's very effective.

    The proxy also benefits pretty much any other application you have that recognizes an OpenAI-compatible API. (Or even if it doesn't)

  • No, still the same. Otoh, it works perfectly fine for Claude, and that is the only one I use. I just wish they would finally add native support for this ...
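
  • The LiteLLM proxy setup described above can be sketched roughly like this (the model name, model ID, and environment variable here are illustrative; check the LiteLLM docs for current values):

    ```yaml
    # config.yaml -- minimal LiteLLM proxy config (sketch, not a tested setup)
    model_list:
      - model_name: claude-sonnet             # the name Open-WebUI will show
        litellm_params:
          model: anthropic/claude-3-5-sonnet-20240620   # example Anthropic model ID
          api_key: os.environ/ANTHROPIC_API_KEY         # read key from the environment
    ```

    Start it with `litellm --config config.yaml`, then point Open-WebUI's OpenAI-compatible API connection at the proxy (by default `http://localhost:4000/v1`). Any other app that speaks the OpenAI API can reuse the same endpoint.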

I tried LibreChat and OpenWebUI, between the two I would recommend OpenWebUI.

It feels a bit less polished but has more functions that run locally and things work better out of the box.

My favorite thing is that I can just type my own questions / requests in markdown so I can get formatting and syntax highlighting.