Comment by woadwarrior01
6 months ago
> This framing feels a bit like the “Dropbox won’t succeed because rsync is easy” thinking.
No, it isn't. There are plenty of end-user GUI apps that make it far easier than Ollama to download and run local LLMs (disclaimer: I build one of them). That's an entirely different market.
IMO, the intersection between the set of people who use a command-line tool and the set of people who are incapable of running `brew install llama.cpp` (or its Windows or Linux equivalents) is exceedingly small.
I can't install any .app on my fairly locked down work computer, but I can `brew install ollama`.
When I read the llama.cpp repo and see that I have to build it, versus Ollama, where I just have to get it, the choice is already made.
I just want something I can quickly run and use with aider, or mess around with. When I need to do real work, I just use whatever OpenAI model we have running on Azure PTUs.
> I can `brew install ollama`.
Can you `brew install llama.cpp`?
Probably, but why would I at this point? It's not on aider's supported list. If I needed to replace Ollama for some reason, I'd probably go with lms.
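For what it's worth, llama.cpp does have a Homebrew formula, and its bundled server exposes an OpenAI-compatible API that aider can talk to. A rough sketch of what that looks like (the model file path and port here are illustrative assumptions, not something from this thread):

```shell
# The Homebrew formula installs llama-cli and llama-server
brew install llama.cpp

# Start the OpenAI-compatible server
# (model path is a placeholder; use any local GGUF file)
llama-server -m ~/models/your-model.gguf --port 8080

# In another terminal, point aider at the local endpoint.
# aider reads OPENAI_API_BASE for OpenAI-compatible servers;
# the key value is unused by llama-server but must be set.
OPENAI_API_BASE=http://localhost:8080/v1 OPENAI_API_KEY=dummy \
  aider --model openai/local-model
```

Whether that beats `ollama run` on convenience is exactly the debate above; it's a couple more moving parts, but nothing requiring a build from source.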