Ask HN: Locally enabled vibe coding environment?


Hello, I've been trying to use Cursor with locally served ChatGPT or Qwen models, but as soon as I go offline it fails. It seems there used to be a way to override the API URL, but that's no longer the case.

Is there any development environment or plugin that you're using with a local LLM?

VS Code with Cline is a close analogue to Cursor, and I can fully recommend it. If you use Ollama, there are some models specifically optimized for agentic tasks in Cline.

The results are pretty good in my opinion, though it probably depends on your use case.
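
If you want to sanity-check that your local Ollama endpoint actually works offline before wiring it into an editor, a minimal probe like the sketch below does the job. It assumes Ollama's default port 11434 and a model you've already pulled; the model name is just an example.

  import json
  import urllib.request

  # Send a one-off, non-streaming generation request to the local Ollama
  # server (default port 11434). Swap the model name for whatever you've
  # pulled with `ollama pull`.
  payload = {
      "model": "qwen2.5-coder:7b",
      "prompt": "Say hello in one word.",
      "stream": False,
  }
  req = urllib.request.Request(
      "http://localhost:11434/api/generate",
      data=json.dumps(payload).encode("utf-8"),
      headers={"Content-Type": "application/json"},
  )
  with urllib.request.urlopen(req, timeout=60) as resp:
      # The non-streaming response carries the full completion in "response"
      print(json.loads(resp.read())["response"])

Once that responds, in Cline's settings you just pick Ollama as the API provider and point it at the same local base URL, and it works without any internet connection.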