nico 1 day ago
How are you running the model? Mistral's API, some local version through ollama, or something else?

layoric 1 day ago
Through OpenRouter; Medium 3 isn't open weights.

kyleee 1 day ago
Is Mistral on OpenRouter?

nico 1 day ago
Yup: https://openrouter.ai/provider/mistral
I guess it can't really be run locally: https://www.reddit.com/r/LocalLLaMA/comments/1kgyfif/introdu...
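For reference, OpenRouter exposes an OpenAI-compatible chat completions endpoint, so calling a Mistral model through it looks roughly like the sketch below. The model ID `mistralai/mistral-medium-3` and the `OPENROUTER_API_KEY` environment variable are assumptions here; check OpenRouter's model list for the exact identifier.

```python
# Minimal sketch of a request to OpenRouter's OpenAI-compatible
# chat completions endpoint. Nothing is sent on import; build_request
# just constructs the HTTP request object.
import json
import os
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(prompt: str) -> urllib.request.Request:
    payload = {
        "model": "mistralai/mistral-medium-3",  # assumed model ID
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {
        # API key is read from the environment; empty if unset.
        "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(
        OPENROUTER_URL, data=json.dumps(payload).encode(), headers=headers
    )

# To actually send it: urllib.request.urlopen(build_request("Hello"))
```

Since the API is OpenAI-compatible, the official `openai` client also works by pointing its `base_url` at OpenRouter.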