
Comment by paradite

6 months ago

For starters:

- It doesn't have a website

- It doesn't have a download page, you have to build it yourself

> - It doesn't have a download page, you have to build it yourself

I'd wager that anyone capable of running a command-line tool like Ollama can also download prebuilt binaries from the llama.cpp releases page[1]. Prebuilt binaries are also available through package managers like Homebrew[2].

[1]: https://github.com/ggerganov/llama.cpp/releases

[2]: https://formulae.brew.sh/formula/llama.cpp
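
For anyone following along, the Homebrew path is only a couple of commands. A minimal sketch, assuming a recent llama.cpp build (which ships a `llama-cli` binary; older builds used different names) and with the model path as a placeholder:

    # Install prebuilt llama.cpp binaries via Homebrew
    brew install llama.cpp

    # Run a prompt against a local model
    # (the GGUF path is a placeholder; supply your own model file)
    llama-cli -m ~/models/your-model.gguf -p "Hello"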

  • I am very technically inclined and use Ollama (in a VM, but still) because of all the steps involved in running llama.cpp and how non-obvious they are. This framing feels a bit like the “Dropbox won’t succeed because rsync is easy” thinking.

    • > This framing feels a bit like the “Dropbox won’t succeed because rsync is easy” thinking.

      No, it isn't. There are plenty of end-user GUI apps that make downloading and running local LLMs far easier than Ollama does (disclaimer: I build one of them). That's an entirely different market.

      IMO, the intersection between the set of people who use a command-line tool and the set of people who are incapable of running `brew install llama.cpp` (or its Windows or Linux equivalents) is exceedingly small.
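
      For reference, one Linux equivalent is pulling a prebuilt archive straight from the latest GitHub release. A sketch only: asset names vary by release and platform, so the `ubuntu` filter here is an assumption you'd adjust for your system:

          # Fetch the latest release metadata and download an Ubuntu x64 build
          curl -s https://api.github.com/repos/ggerganov/llama.cpp/releases/latest \
            | grep browser_download_url \
            | grep ubuntu \
            | cut -d '"' -f 4 \
            | xargs curl -LO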


  • And you still need to find and download the model files yourself, among other steps, which is intimidating enough to drive away most users, including skilled software engineers. Most people just want it to work so they can get on with whatever they actually set out to do.

    It's the same reason I use `apt install` instead of compiling from source. I'm perfectly capable of compiling, but I don't, because installing is just a means to an end.
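
    For context, "finding the model files" usually means downloading a GGUF file from Hugging Face. A sketch with placeholders (the angle-bracketed parts are not real repo or file names):

        # Download a GGUF model file (names are illustrative placeholders)
        curl -L -o model.gguf \
          "https://huggingface.co/<user>/<repo>/resolve/main/<file>.gguf"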

  • OK, I was looking at the repo on mobile and missed the releases page.

    Still, it's not immediately obvious from the README that there's an option to download it. There are instructions on how to build it, but not on how to download it. Or maybe I'm blind; please correct me.

  • I'm perfectly capable of compiling my own software, but why bother when I can `curl | sh` into Ollama?
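
    For comparison, the flow being referenced boils down to two commands. The one-liner is Ollama's documented Linux installer; the model name is just an example from their library:

        # Ollama's one-line Linux install
        curl -fsSL https://ollama.com/install.sh | sh

        # Download and chat with a model in one step
        ollama run llama3.2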