Comment by portaouflop
6 months ago
I can only speak for myself, but to me llama.cpp looks kind of hard to use (tbh I never tried it), whereas ollama was just one CLI command away. Also I had no idea that it's the equivalent; I thought llama.cpp was some experimental tool for hardcore LLM cracks, not something that I could teach, for example, my non-technical mom to use.
Looking at the llama.cpp repo, it's still not obvious to me how to use it without digging in. It seems I need to download models from Hugging Face and configure stuff, etc.; with ollama I type "ollama run" or something and it just works.
Tbh I don't use that stuff a lot or even seriously, maybe once a month to try out new local models.
I think having an easy-to-use quickstart would go a long way for llama.cpp, but maybe it's not intended for casual (stupid?) users like me…
In my mind, it doesn't help that llama.cpp is named after a source file. Intuitively, that name screams "library for further integration," not "tool for end users."