nico (2 months ago):
How are you running the model? Mistral's API, or some local version through ollama, or something else?
layoric (2 months ago):
Through OpenRouter; Medium 3 isn't open weights.

kyleee (2 months ago):
Is Mistral on OpenRouter?

nico (2 months ago):
Yup: https://openrouter.ai/provider/mistral
I guess it can't really be run locally: https://www.reddit.com/r/LocalLLaMA/comments/1kgyfif/introdu...
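Since the model isn't open weights, "through OpenRouter" means calling its hosted, OpenAI-compatible chat-completions endpoint rather than running anything locally. A minimal sketch of what such a request looks like, assuming the model slug `mistralai/mistral-medium-3` and an `OPENROUTER_API_KEY` environment variable (both are assumptions; check OpenRouter's model list and docs for the exact values):

```python
import json
import os
import urllib.request

# OpenRouter exposes an OpenAI-compatible chat-completions API.
# The model slug and env-var name below are assumptions; verify on openrouter.ai.
API_URL = "https://openrouter.ai/api/v1/chat/completions"


def build_request(prompt: str,
                  model: str = "mistralai/mistral-medium-3") -> urllib.request.Request:
    """Build (but do not send) a chat-completion request for OpenRouter."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {
        "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers=headers,
        method="POST",
    )


req = build_request("Hello!")
# Actually sending it requires a real API key:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The same endpoint also works with the official OpenAI client libraries by pointing `base_url` at `https://openrouter.ai/api/v1`, which is why most local-model tooling can talk to it unchanged.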