nico 4 months ago: How are you running the model? Mistral's API, a local version through Ollama, or something else?

layoric 4 months ago: Through OpenRouter; Medium 3 isn't open weights.

kyleee 4 months ago: Is Mistral on OpenRouter?

nico 4 months ago: Yup: https://openrouter.ai/provider/mistral
I guess it can't really be run locally: https://www.reddit.com/r/LocalLLaMA/comments/1kgyfif/introdu...
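Since Medium 3 isn't open weights, the only route is a hosted API. A minimal sketch of calling it through OpenRouter's OpenAI-compatible chat-completions endpoint — the model slug `mistralai/mistral-medium-3` is an assumption here; check openrouter.ai/models for the exact id:

```python
import os
import requests  # pip install requests


def build_request(prompt: str, model: str = "mistralai/mistral-medium-3") -> dict:
    """Assemble a chat-completions request for OpenRouter.

    The model slug is assumed, not confirmed by the thread --
    verify it against OpenRouter's model list.
    """
    return {
        "url": "https://openrouter.ai/api/v1/chat/completions",
        "headers": {
            "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
            "Content-Type": "application/json",
        },
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }


if __name__ == "__main__":
    req = build_request("Hello, who trained you?")
    # With a valid OPENROUTER_API_KEY set, the actual call would be:
    #   resp = requests.post(**req).json()
    #   print(resp["choices"][0]["message"]["content"])
    print(req["json"]["model"])
```

Because OpenRouter mirrors the OpenAI API shape, the same payload works with the `openai` SDK by pointing `base_url` at `https://openrouter.ai/api/v1`.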