From my understanding, Simon's project only supports OpenAI and OpenAI-compatible models, in addition to local models. For example, if I wanted to use a model on Amazon Bedrock I'd have to first deploy (and manage) a gateway/proxy layer[1] to make it OpenAI-compatible.
Mozilla's project boasts a lot of existing interfaces already, much like LiteLLM, which has the benefit of being able to directly use a wider range of supported models.
> No Proxy or Gateway server required so you don't need to deal with setting up any other service to talk to whichever LLM provider you need.
As for how it compares to LiteLLM, I don't have enough experience with either to say.
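For a concrete sense of the difference, here's roughly what a direct Bedrock call looks like through LiteLLM, with no proxy in between (a sketch from memory; the region and model ID are just examples, and it assumes AWS credentials are already configured):

    import os
    from litellm import completion

    # LiteLLM picks up standard AWS credentials from the environment;
    # the region and model ID below are illustrative.
    os.environ["AWS_REGION_NAME"] = "us-east-1"

    response = completion(
        model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
        messages=[{"role": "user", "content": "Hello from Bedrock"}],
    )
    print(response.choices[0].message.content)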
[1] https://github.com/aws-samples/bedrock-access-gateway
Not true, I use it with Gemini: https://llm.datasette.io/en/stable/plugins/directory.html#re...
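Installing a provider plugin is a one-liner (llm install llm-gemini, then llm keys set gemini), after which the model works from the CLI and from Python. A rough sketch, with the model ID as an assumption:

    import llm

    # Requires the llm-gemini plugin and a configured API key.
    # The model ID is illustrative; `llm models` lists what's available.
    model = llm.get_model("gemini-1.5-flash")
    print(model.prompt("Say hello").text())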
Plugins! I completely missed that when testing this earlier. Thank you, will have to take another look at it.