Comment by _1

3 days ago

This looks really nice. We've been considering developing something very similar in-house. Are you guys looking at supporting MLC Web LLM, or some other local models?

Yup! We rely on the AI SDK for model routing, and it has an Ollama provider, which will handle pretty much any local model.