Comment by calcsam
3 days ago
Yup! We rely on the AI SDK for model routing, and they have an Ollama provider, which will handle pretty much any local model.
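For reference, here's a minimal sketch of what that looks like, assuming the community `ollama-ai-provider` package for the AI SDK and a locally pulled model (the model name `llama3.1` is just an example):

```ts
import { generateText } from 'ai';
import { ollama } from 'ollama-ai-provider';

// Route a request to a local model served by Ollama.
const { text } = await generateText({
  model: ollama('llama3.1'), // any model available in your local Ollama instance
  prompt: 'Summarize what model routing means in one sentence.',
});

console.log(text);
```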
Can confirm this works well with any OpenAI-compatible API endpoint, like Ollama's or LM Studio's.
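Since those servers expose OpenAI-compatible endpoints, another option is pointing the AI SDK's OpenAI provider at them. A rough sketch, assuming Ollama's default `http://localhost:11434/v1` endpoint (LM Studio's is usually `http://localhost:1234/v1`):

```ts
import { generateText } from 'ai';
import { createOpenAI } from '@ai-sdk/openai';

// Point the OpenAI provider at a local OpenAI-compatible server.
// Assumes Ollama's default port; swap the baseURL for LM Studio.
const local = createOpenAI({
  baseURL: 'http://localhost:11434/v1',
  apiKey: 'ollama', // local servers typically ignore the key, but the client expects one
});

const { text } = await generateText({
  model: local('llama3.1'),
  prompt: 'Hello from a local model!',
});

console.log(text);
```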