Comment by 1dom
3 days ago
I don't understand this move. A frontend desktop application is the opposite of what I and everyone else I know use Ollama for. It's a local LLM backend. It's been around long enough now that any long-term users have found, created and/or adjusted to their own frontend interface.
I'm comfy, but some of the cutting-edge local LLMs have been a little slow to become available recently; maybe this frontend focus is why.
I will now go and look at other options like Ollama that have either been fully UI-integrated from the start or are committed to just being a headless backend. If any of them seem better, I'll consider switching; I probably should have done this sooner.
I hope this isn't the first step in Ollama dropping the local CLI focus, offering a subscription and becoming a generic LLM interface like so many of these tools seem to converge on.
A rightful worry, and we had the same doubts before we embarked on this. Ollama serves developers, there is no doubt about that. The CLI isn’t getting dropped; in fact, what we’ve learned in building the interface is that having it interact with Ollama is a great way for us to dogfood Ollama while we build it.
There are so many choices for having an interface, and as a developer you should have a choice in selecting the UI you want. It will all continue to work with Ollama. Nothing about that changes.
Thanks for the response, appreciated. It confirms my feelings though: there are already so many choices for an interface, so why are you - a team of people who built an LLM backend - now spending your time doing frontend work under the same backend product name?
This is sending a very loud message that your focus is drifting away from why I use your product. If it were drifting into something new and original that supplements my usage of your product, I could see the value, but like you said: there are already so many good choices of interface. Now you're going to have to play catch-up against people whose first choice and genuine passion is LLM frontend UIs.
Sorry! I will still use ollama, and thank you so much for all the time and effort put in. I probably wouldn't have had a fraction of the local LLM fun I've had if it wasn't for ollama, even if my main usage is through openwebui. Ultimately, my personal preference is software that does 1 thing and does it well. Others prefer the opposite: tightly integrated all-bells-and-whistles, and I'm sure those people will appreciate this more than me - do what works for you, it's worked so far:)
> some of the cutting edge local LLMs have been a little bit slow to be available recently
You can pull models directly from Hugging Face: ollama pull hf.co/google/gemma-3-27b-it
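A small sketch, in case it helps: Ollama can also pull a specific quantization from a Hugging Face repo by appending a tag to the same command; whether a given tag works depends on which GGUF files that repo actually publishes, so treat the tag below as an assumption.

  # assumes the repo publishes a GGUF file with a Q4_K_M quantization
  ollama pull hf.co/google/gemma-3-27b-it:Q4_K_M

  # then run it like any other local model
  ollama run hf.co/google/gemma-3-27b-it:Q4_K_M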
I know, I often do that, but it's still not enough. E.g. things like SmolLM3, which required some llama.cpp tweaks, wouldn't work via GGUF for the first week after it had been released.
Just checked: https://github.com/ollama/ollama/issues/11340 is still an open issue.
There are many GUIs for Ollama.
This looks like a version of Ollama that bundles one.
I agree.
I just can't see a user-focused benefit in a backend service provider starting to build and bundle its own frontend when there are already a bunch of widely used frontends available.