Comment by A4ET8a8uTh0_v2

17 hours ago

Thank you. This is genuinely a valid reason even from a simple consistency perspective.

(edit: Having read some of the links, I think I understand why Ollama comes across as less of a hero. Still, I'm giving them some benefit of the doubt, since they made local models very accessible to plebs like me; maybe I can graduate to no Ollama.)

I think this is the key point: if you can use llama.cpp directly, you probably shouldn't use Ollama. It's designed for beginners.