Comment by bigyabai
1 day ago
I really can't recommend Ollama. It is slow, missing tons of llama.cpp features, and doesn't expose many settings to the user. KoboldCpp is a much better inference provider, and it even has an Ollama-compatible API endpoint.
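(A minimal sketch of what that compatibility claim would mean in practice: pointing an Ollama-style request at a local KoboldCpp server. The port, 5001, is KoboldCpp's default, but the exact route and response shape of its Ollama emulation are assumptions here; check the KoboldCpp docs for what it actually exposes.)

```python
# Sketch: send an Ollama-style /api/generate request to a local
# KoboldCpp server. Assumes KoboldCpp mirrors Ollama's route and
# JSON response format -- verify against the KoboldCpp docs.
import json
import urllib.request

payload = json.dumps({
    "model": "local-model",       # hypothetical model name
    "prompt": "Why is the sky blue?",
    "stream": False,              # request one complete JSON reply
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:5001/api/generate",  # 5001 = KoboldCpp default port
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.load(resp)
    # Ollama's non-streaming replies carry the text in "response"
    print(body.get("response", ""))
```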