
Comment by leopoldj

2 days ago

The self-hosting section covers the corporate use case using vLLM and SGLang, as well as personal desktop use with Ollama, which is a wrapper over llama.cpp.

Recommending Ollama isn't useful for end users; it's just a trap in a nice-looking wrapper.

  • Strong disagree on this. Ollama is great for moderately technical users who aren't really programmers or proficient with the command line.

    • You can disagree all you want, but Ollama does not keep its vendored copy of llama.cpp up to date, and its mirror ships poorly labeled models that claim to be the upstream originals, often misappropriated from major community members (Unsloth, et al.).

      When you pull a model from Ollama's registry, you have no clue what you're actually getting, and people with no experience aren't even aware of this.

      Ollama is an unrestricted footgun because of this.
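      The mislabeling complaint above suggests a generic safeguard: whatever tool fetched the weights, verify the downloaded file's checksum against the digest the original uploader published. A minimal sketch in Python (the file path and published digest below are hypothetical, purely for illustration):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file in chunks and return its hex SHA-256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical usage: compare against the digest the original
# uploader (e.g. on their model page) published for the file.
# published = "a3f5..."  # hypothetical digest
# assert sha256_of("model.gguf") == published
```

      Streaming in chunks matters here because GGUF model files are often tens of gigabytes, far too large to read into memory at once.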
