
Comment by whs

14 hours ago

Copilot in VSCode is integrated with VSCode's LLM provider API, which means that any plugin that needs LLM capabilities can submit requests to Copilot. Roo Code supports that as an option. And of course, there's a plugin that starts an OpenAI/Anthropic-compatible web server inside VSCode that just calls into the LLM provider API. It seems that if you use unlimited models (like GPT-4.1) you probably get unlimited API calls. However, those models don't seem to be very agentic when used in Claude Code.
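
For reference, here's a minimal sketch of what the plugin-side call looks like through the `vscode.lm` API (the command id, model family, and prompt are just placeholders, not anything from a real extension):

```typescript
import * as vscode from 'vscode';

export function activate(context: vscode.ExtensionContext) {
  const command = vscode.commands.registerCommand('demo.askCopilot', async () => {
    // Ask for a Copilot-provided chat model; availability depends on the
    // user's Copilot subscription and consent.
    const [model] = await vscode.lm.selectChatModels({ vendor: 'copilot', family: 'gpt-4.1' });
    if (!model) {
      vscode.window.showWarningMessage('No Copilot chat model available.');
      return;
    }

    const messages = [vscode.LanguageModelChatMessage.User('Summarize this workspace.')];
    const response = await model.sendRequest(messages, {}, new vscode.CancellationTokenSource().token);

    // The reply streams back as text fragments.
    let text = '';
    for await (const fragment of response.text) {
      text += fragment;
    }
    vscode.window.showInformationMessage(text.slice(0, 200));
  });
  context.subscriptions.push(command);
}
```

The OpenAI/Anthropic-compatible server plugins mentioned above basically wrap this same `selectChatModels` / `sendRequest` flow behind a local HTTP endpoint, so external tools like Claude Code can point at it as if it were a hosted API.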