Comment by xrd

12 hours ago

Does this indicate running locally with a very small (quantized?) model?

I am very interested in finding ways to combine skills + local models + MCP + aider-ish tools to avoid using commercial LLM providers.

Is this a path worth following? Or something different?