Comment by woadwarrior01
6 months ago
> - It doesn't have a download page, you have to build it yourself
I'd wager that anyone capable enough to run a command line tool like Ollama should also be able to download prebuilt binaries from the llama.cpp releases page[1]. Also, prebuilt binaries are available on things like homebrew[2].
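To make the "prebuilt binaries" claim concrete, here is roughly what the Homebrew route looks like. This is a sketch: the formula name `llama.cpp` is real, but the model path below is illustrative, and the exact binary names may vary between releases.

```shell
# Install prebuilt llama.cpp binaries via Homebrew (macOS/Linux)
brew install llama.cpp

# The formula installs command-line tools such as llama-cli;
# point one at a local GGUF model file (path is illustrative)
llama-cli -m ./models/model.gguf -p "Hello"
```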
I am very technically inclined and use Ollama (in a VM, but still) because of all the steps and non-obviousness of how to run llama.cpp. This framing feels a bit like the “Dropbox won’t succeed because rsync is easy” thinking.
> This framing feels a bit like the “Dropbox won’t succeed because rsync is easy” thinking.
No, it isn't. There are plenty of end-user GUI apps that make it far easier than Ollama to download and run local LLMs (disclaimer: I build one of them). That's an entirely different market.
IMO, the intersection between the set of people who use a command line tool, and the set of people who are incapable of running `brew install llama.cpp` (or its Windows or Linux equivalents) is exceedingly small.
I can't install any .app on my fairly locked down work computer, but I can `brew install ollama`.
When I read the llama.cpp repo and see I have to build it, vs ollama where I just have to get it, the choice is already made.
I just want something I can quickly run and use with aider, or mess around with. When I need to do real work, I just use whatever OpenAI model we have running on Azure PTUs.
And you still need to find and download the model files yourself, among other steps, which is intimidating enough to drive away most users, including skilled software engineers. Most people just want it to work and start using it for something else as soon as possible.
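The difference being described is roughly the following. The `ollama run` command is real and pulls the model for you; with llama.cpp the traditional workflow is to locate a GGUF file yourself (e.g. on Hugging Face) and pass its path. The model names and file path below are illustrative, not prescribed by either project.

```shell
# Ollama bundles model discovery: one command downloads and runs a model
# (model name is illustrative)
ollama run llama3.2

# With llama.cpp you typically find and download a GGUF file yourself,
# then point the binary at it (path is illustrative)
llama-cli -m ~/models/Llama-3.2-3B-Instruct-Q4_K_M.gguf -p "Hello"
```

Newer llama.cpp builds can also fetch models directly from Hugging Face, which narrows this gap, but that option is easy to miss from the README.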
For the same reason I use `apt install` instead of compiling from source. I can definitely do that, but I don't, because installing is just a means to an end.
Ok, I was looking at the repo from mobile and missed the releases.
Still, it's not immediately obvious from the README that there's an option to download it. There are instructions on how to build it, but not on how to download it. Or maybe I'm blind; please correct me.
I'm perfectly capable of compiling my own software, but why bother when I can `curl | sh` into Ollama?
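The `curl | sh` route being referred to is Ollama's documented one-line Linux install. The install URL is Ollama's published script; the model name afterwards is illustrative.

```shell
# Ollama's documented one-line install on Linux
curl -fsSL https://ollama.com/install.sh | sh

# Then pull and start chatting with a model in one step
# (model name is illustrative)
ollama run llama3.2
```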