Comment by CuriouslyC
2 days ago
LiteLLM is kind of a mess, TBH. I guess it's OK if you just want a Docker container to proxy to for personal projects, but actually using it in production isn't great.
I definitely appreciate all the work that has gone into LiteLLM, but it doesn't take much browsing through the 7000+ line `utils.py` to see where using it could become problematic (https://github.com/BerriAI/litellm/blob/main/litellm/utils.p...)
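For readers unfamiliar with the "docker container to proxy to" use case: the LiteLLM proxy is typically driven by a YAML config that maps public model names to provider-specific backends. A minimal sketch (the model names here are illustrative, not a recommendation):

```yaml
# Minimal LiteLLM proxy config sketch.
# Maps a client-facing model name to an upstream provider model;
# "os.environ/..." tells the proxy to read the key from an env var.
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
```

Clients then hit the proxy with any OpenAI-compatible SDK, which is what makes the single-container setup attractive for personal projects.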
Can you double-click on that a bit? Many files in professional repos are thousands of lines. LoC in itself is not a code smell.
LiteLLM is the worst code I have ever read in my life. Quite an accomplishment, lol.
4 replies →
> but actually using it in production isn't great.
I only use it in development. Could you elaborate on why you don't recommend using it in production?
The people behind Envoy proxy built https://github.com/katanemo/archgw - it carries the learnings of Envoy but is natively designed to process/route prompts to agents and LLMs. Would be curious about your thoughts.