Comment by sschueller

2 days ago

Does anyone know a good alternative project that works similarly (sharing multiple LLMs across a set of users)? LiteLLM has been getting worse and keeps trying to get me to upgrade to a paid version. I also had issues with creating tokens for other users, etc.

If you're talking about their proxy offering, I had this exact same issue and switched to Portkey. I just use their free plan and don't care about the logs (I log separately on my own). It's way faster, probably because their code isn't garbage like the LiteLLM code (last time I checked, LiteLLM had a 5K+ line Python file containing most of their important code).

Bifrost is the only real alternative I'm aware of https://github.com/maximhq/bifrost

  • Virtual Keys is an Enterprise feature. I am not going to pay for something like this just to give my family access to all my models. I can do without cost control (although it would be nice), but I need users to be able to generate a key and use that key to access all the models I provide.

    • I just deployed it to test it out and this is FALSE. I was able to create Virtual Keys on the free version with no issues.

      Please double-check the facts; you might falsely deter people.

    • I don’t believe it is an enterprise feature. I did some testing on Bifrost just last month on a free open source instance and was able to set up virtual keys.

  • We have tried reaching out to their sales multiple times but never get a response.
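For anyone unfamiliar with what "virtual keys" actually do in these OpenAI-compatible gateways: each user gets their own key, the client sends it as a Bearer token, and the gateway swaps in its own upstream provider credentials. A minimal sketch of the request a user's client would assemble (the gateway URL, key format, and model name below are placeholders, not Bifrost's or Portkey's actual values):

```python
# Sketch of a request against a self-hosted OpenAI-compatible gateway.
# GATEWAY_URL and the virtual key are hypothetical placeholders.
GATEWAY_URL = "http://localhost:8080/v1/chat/completions"

def build_request(user_virtual_key: str, model: str, prompt: str) -> dict:
    """Assemble the OpenAI-style request a single user's client sends.

    The gateway authenticates the *user* via the Bearer token, then
    substitutes its own upstream provider API keys when forwarding.
    """
    return {
        "url": GATEWAY_URL,
        "headers": {
            "Authorization": f"Bearer {user_virtual_key}",
            "Content-Type": "application/json",
        },
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

req = build_request("vk-alice-123", "some-model", "hello")
print(req["headers"]["Authorization"])  # per-user key travels with each request
```

Because the protocol is just the OpenAI chat-completions shape, any stock OpenAI SDK pointed at the gateway's base URL works unchanged; only the key differs per user.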

agentgateway.dev is one I have been working on that is worth a look if you are using the proxy side of LiteLLM. It's open source and part of the Linux Foundation.