
Comment by dpkirchner

1 day ago

It'd be easy enough for Ollama alternatives -- they just need a CLI front end that runs a model with reasonable efficiency without the user passing any flags. That's really Ollama's value, as far as I can tell.

Ollama itself doesn't pass that test. (Broken default context settings, non-standard storage formats, and crazy model names.)
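
To be concrete, the kind of front end I mean could be only a few lines. Here's a minimal sketch that wraps llama.cpp's llama-cli (the -m, -c, and -ngl flags are real llama.cpp options, but the default values and the wrapper itself are my own guesses, not anything Ollama actually does):

    #!/usr/bin/env python3
    """Hypothetical "no flags" runner: the user names a model file,
    the wrapper picks sensible defaults for everything else."""
    import subprocess
    import sys

    DEFAULTS = [
        "-c", "8192",   # context window: assume the model handles at least 8k
        "-ngl", "999",  # offload as many layers to the GPU as will fit
    ]

    def run(model_path: str) -> None:
        # llama-cli is llama.cpp's bundled CLI; only the chosen
        # default values above are assumptions.
        subprocess.run(["llama-cli", "-m", model_path, *DEFAULTS], check=True)

    if __name__ == "__main__":
        if len(sys.argv) != 2:
            sys.exit("usage: run-model <model.gguf>")
        run(sys.argv[1])

The hard part isn't the wrapper, it's choosing defaults that are actually right for each model, which is exactly where Ollama falls down.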

  • I haven't experienced this personally, though I've stuck with pretty mainstream models like llama, gemma, deepseek, etc.