Comment by decide1000

7 days ago

It was fun because it was open. Now it's just another brand seeking dollars.

Ollama at its core will always be open. Not all users have the hardware to run models locally, and it is only fair that the users who optionally want GPUs, which cost us money to provide, are the ones who pay for them.

  • I think it’s the logical move to ensure Ollama can continue to fund development. You will probably end up having to add more tiers, or some way for users to buy extra credits/GPU time. See Anthropic’s recent move with Claude Code, prompted by a number of users running it 24/7.

I’m not throwing in the towel on Ollama yet. They do need dollars to operate, but they still provide excellent software for running models locally, without paying them a dime.

  • ^ This. As a developer, Ollama has been my go-to for serving models offline. I then use Cloudflare Tunnels to make them available wherever I need them.
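
    For anyone curious, a minimal sketch of that setup: it assumes `ollama` and `cloudflared` are installed, and uses Ollama's default port (11434) with a throwaway Cloudflare quick tunnel. The model name is just a placeholder.

    ```shell
    # Start the Ollama server locally (listens on 127.0.0.1:11434 by default)
    ollama serve &

    # Open a quick tunnel; cloudflared prints a public *.trycloudflare.com URL
    # that forwards to the local Ollama API
    cloudflared tunnel --url http://localhost:11434

    # From anywhere, hit the tunnel URL instead of localhost
    # (replace <tunnel-url> with the URL cloudflared printed, and the model
    # with one you have pulled)
    curl https://<tunnel-url>/api/generate \
      -d '{"model": "llama3", "prompt": "Hello"}'
    ```

    Quick tunnels get a random URL each run; for a stable hostname you'd create a named tunnel tied to your own domain, and you'd want some form of authentication in front of it before exposing the API publicly.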

Although it is open, it’s really just code borrowed from llama.cpp.

If you want to see where the actual developers do the actual hard work, go use llama.cpp instead.