Comment by calcsam
4 months ago
Yup! We rely on the AI SDK for model routing, and they have an Ollama provider, which will handle pretty much any local model.
Can confirm this works well with any OpenAI-like API endpoint, like Ollama's or LM Studio's.
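
For anyone who wants a concrete starting point, here's a minimal sketch of pointing the AI SDK's OpenAI-compatible provider at a local server. The base URLs, model name, and placeholder API key are assumptions, swap in whatever your local setup actually exposes (Ollama's OpenAI-compatible endpoint defaults to port 11434, LM Studio's to 1234).

```ts
// Sketch: routing the AI SDK to a local OpenAI-compatible endpoint.
// Adjust baseURL and model to match your local server.
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const local = createOpenAI({
  baseURL: "http://localhost:11434/v1", // Ollama's OpenAI-compatible endpoint; LM Studio uses http://localhost:1234/v1
  apiKey: "ollama",                     // local servers typically ignore the key, but the client expects one
});

const { text } = await generateText({
  model: local("llama3.1"),             // any model you've pulled locally (assumed name)
  prompt: "Say hello from a local model.",
});

console.log(text);
```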