Comment by dumbmrblah, 21 hours ago: Thanks! Do you happen to know if there are any OpenWebUI plugins similar to this?

Reply by irthomasthomas, 20 hours ago: You can use this with OpenWebUI already. Just llm install llm-model-gateway. Then, after you save a consortium, run llm serve --host 0.0.0.0. This gives you an OpenAI-compatible endpoint which you can add to your chat client.
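The reply above can be sketched as a short shell session. The plugin name and the serve command come straight from the reply; the consortium-saving step, port number, and endpoint path are assumptions for illustration, so check the output of llm serve for the actual address:

```shell
# Install the gateway plugin into the llm CLI (name as given in the reply)
llm install llm-model-gateway

# Save a consortium first (see the llm-consortium plugin docs for the
# exact command and options; this step is described but not shown above)

# Start the OpenAI-compatible server, listening on all interfaces
llm serve --host 0.0.0.0

# In OpenWebUI, add a custom OpenAI-compatible connection pointing at
# the served address (port and /v1 path are assumptions; use whatever
# llm serve prints on startup), then select the consortium as a model.
```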