Comment by mikro

2 months ago

I've been waiting for someone to dig into this more deeply! Looking forward to Part 2!

I use both Claude Code and Xcode with a local LLM (running with LM Studio), and I noticed they both have system prompts that make them work like magic.

If anyone reading this is interested in setting up Claude Code to run offline, I followed these instructions:

https://medium.com/@luongnv89/setting-up-claude-code-locally...

My personal LLM preference is Qwen3-Next-80B with 4-bit quantization, which takes about 45 GB of RAM.
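That figure lines up with a back-of-the-envelope estimate: 80B parameters at 4 bits each is about 40 GB of raw weights, and quantization metadata plus runtime overhead add a bit more. A quick sketch (the 15% overhead factor is an assumption for illustration, not a measured number):

```python
# Rough RAM estimate for a 4-bit quantized 80B-parameter model.
params = 80e9
bits_per_param = 4
weight_gb = params * bits_per_param / 8 / 1e9  # bits -> bytes -> GB

# Quantization scales/zero-points and the KV cache add overhead on top
# of the raw weights; 15% here is an illustrative assumption.
overhead = 1.15
total_gb = weight_gb * overhead

print(f"weights: {weight_gb:.0f} GB, with overhead: ~{total_gb:.0f} GB")
```

which lands right around the ~45 GB observed in practice.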

Thanks — glad it resonated! Part 2 should uncover a lot of the magic behind the scenes. And thanks for sharing the link. Running Claude Code against a local LLM is a really interesting direction, but I need more RAM...