Comment by ttyprintk 3 months ago
https://huggingface.co/simplescaling
anentropic 3 months ago
and: https://github.com/simplescaling/s1
mettamage 3 months ago
As someone who's only used Ollama, how do I go about using this model?
davely 3 months ago
I think we need to wait for someone to convert it into the GGUF file format.
However, once that happens, you can run it (and any GGUF model) straight from Hugging Face![0]
[0] https://huggingface.co/docs/hub/en/ollama
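For reference, the Hugging Face doc linked above describes running any GGUF repo directly through Ollama with an `hf.co/...` model path. A minimal sketch, assuming a GGUF conversion of the model has been published (the repo path below is a placeholder, not a real repo):

```shell
# Run a GGUF model hosted on Hugging Face directly via Ollama.
# Replace the placeholder path with an actual repo that contains GGUF files.
ollama run hf.co/{username}/{repository}

# A specific quantization can be selected by appending its tag, e.g.:
ollama run hf.co/{username}/{repository}:Q4_K_M
```

Ollama pulls the GGUF weights from the Hub on first run, so no manual download or Modelfile is needed for this path.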