Comment by vovavili
23 days ago
Mac Minis are perfect for locally running demanding models because they can effectively use ordinary RAM as VRAM.
But people don't use OpenClaw with local models.
They definitely do. A common configuration is running a supervisor model in the cloud and a much smaller model locally to churn on long-running tasks. This frees OpenClaw up to lavishly iterate on tool building without burning through too many tokens.
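To make the split concrete, here's a purely illustrative sketch of that setup. OpenClaw's real configuration format, keys, and provider names may differ entirely; every identifier below is hypothetical, and the local endpoint assumes an Ollama-style server.

```yaml
# Hypothetical sketch only -- not OpenClaw's actual config schema.
models:
  supervisor:
    provider: cloud              # large hosted model: planning and review
    name: <cloud-model-id>       # placeholder, fill in your provider's model
  worker:
    provider: local              # small local model: grinds on long tasks
    endpoint: http://localhost:11434   # assumed Ollama-style local server
    name: <small-local-model>    # placeholder
routing:
  plan_and_review: supervisor    # few, expensive calls
  tool_iteration: worker         # many, cheap calls
```

The point is the routing: the expensive cloud model only sees the occasional plan/review step, while the bulk of the iteration loop stays on hardware you already own.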
Unless you're running a large local model on 192 GB+ of unified memory, this setup just won't be ideal, based on real-world experience.