
Comment by washadjeffmad

6 months ago

>So where's the non-sketchy, non-for-profit equivalent

llama.cpp, kobold.cpp, oobabooga, lmstudio, etc. There are dozens at this point.

And while many chalk the attachment to ollama up to a "skill issue", that's just venting frustration that all a project has to do to win the popularity contest is repackage and market itself as an "app".

I prefer first-party tools, I'm comfortable managing a build environment and calling models using PyTorch, and ollama doesn't really cover my use cases, so I'm not its audience. I still recommend it to people who might want the training wheels while they figure out how not-scary local inference actually is.
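(For concreteness, "calling models using PyTorch" usually means something like the minimal sketch below, assuming the Hugging Face transformers wrapper on top of PyTorch; the model id is illustrative, and `device_map="auto"` additionally needs the accelerate package.)

```python
# Minimal sketch of "first-party" local inference via transformers/PyTorch.
# Model id is illustrative; any local or Hub causal-LM id works.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/Qwen2.5-0.5B-Instruct"  # small, ungated example model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # halve memory footprint on GPU
    device_map="auto",          # place weights on available devices
)

inputs = tokenizer(
    "Explain local inference in one sentence.", return_tensors="pt"
).to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```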

> llama.cpp, kobold.cpp, oobabooga

None of these three are remotely as easy to install or use. They could be, but none of them are even trying.
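(For contrast, a rough sketch of the workflow this reply is implicitly defending: after installing Ollama and running `ollama pull`, everything goes through one local REST endpoint on Ollama's documented default port 11434. The model name here is illustrative.)

```python
# Calling a locally running Ollama server; stdlib only, no extra deps.
import json
import urllib.request

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "llama3",           # any model fetched via `ollama pull`
        "prompt": "Why is the sky blue?",
        "stream": False,             # one JSON object instead of a stream
    }).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```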

> lmstudio

This is a closed-source app with a non-free license from a business that isn't making money. Enshittification is just a matter of when.

  • I would argue that kobold.cpp is even easier to use than Ollama. You click the link in the README to download an .exe, double-click it, and select your model file. No command line involved.

    Which part of the user experience did you have problems with when using it?

    • You’re coming at it from a point of knowledge. Read the first sentence of the Ollama website against the first paragraph of kobold’s GitHub. Newcomers don’t have a clue what “running a GGUF model…” means. It’s written by tech folk without an understanding of the audience.
