Comment by tarruda
8 days ago
> Ollama does not use llama.cpp anymore
That is interesting; did Ollama develop its own proprietary inference engine, or did you move to something else?
Any specific reason why you moved away from llama.cpp?
It's all open; specifically, the new models are implemented here: https://github.com/ollama/ollama/tree/main/model/models