Comment by davely
6 months ago
For what it's worth, HuggingFace provides documentation on how to run any GGUF model inside Ollama[0]. You're not locked into their closed library, and you don't have to wait for them to add new models.
Granted, they could be a lot more helpful in documenting how to do this. But the feature exists, at least.
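As a rough sketch of what that looks like in practice (assuming the ollama Python client is installed and the Ollama server is running; the repo name and quant tag below are only illustrative), Ollama accepts model references of the form hf.co/&lt;user&gt;/&lt;repo&gt;:&lt;quant&gt; and fetches the matching GGUF file from the Hugging Face Hub:

    import ollama

    # Ollama resolves "hf.co/<user>/<repo>:<quant>" references to a GGUF file
    # hosted on the Hugging Face Hub (repo and quant tag here are placeholders).
    model = "hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF:Q4_K_M"

    ollama.pull(model)  # downloads the GGUF weights through the Ollama server
    reply = ollama.chat(
        model=model,
        messages=[{"role": "user", "content": "Say hello in one sentence."}],
    )
    print(reply["message"]["content"])

The same model reference works with the plain CLI as well, so no Modelfile or manual conversion step is needed for models already published as GGUF.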