Comment by bbshfishe
2 hours ago
Yeah no it didn’t. If you have a fully specced-out M3/M4 MacBook with enough memory, you’re already running pretty decent models locally. But no one is using local models anyway.
I run a local model on the daily. I have it making tickets when certain emails come in, and I made a small button I can click to approve ticket creation. It follows my instructions and has a nice chain-of-thought process trained in. Local LLMs are starting to become very useful. Not OpenClaw crap.
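For what it's worth, the workflow described above can be sketched roughly like this. This is a hypothetical illustration, not the commenter's actual setup: the email trigger rule, the `Ticket`/`ApprovalQueue` names, and the stubbed `draft_summary` function are all assumptions. In practice `draft_summary` would call a locally served model instead of truncating text.

```python
# Sketch: a local LLM drafts a ticket from an incoming email; the draft
# sits in a queue until a human clicks "approve".
from dataclasses import dataclass, field

@dataclass
class Ticket:
    subject: str
    body: str
    approved: bool = False

@dataclass
class ApprovalQueue:
    pending: list = field(default_factory=list)

    def add(self, ticket: Ticket) -> None:
        self.pending.append(ticket)

    def approve(self, index: int) -> Ticket:
        # The "click to approve" step: pop the draft and mark it approved.
        ticket = self.pending.pop(index)
        ticket.approved = True
        return ticket

def draft_summary(email_body: str) -> str:
    # Stand-in for the local-LLM call that summarizes the email.
    return email_body.strip().splitlines()[0][:80]

def on_email(subject: str, body: str, queue: ApprovalQueue) -> None:
    # Assumed trigger rule: only certain emails spawn a ticket draft.
    if subject.lower().startswith("[support]"):
        queue.add(Ticket(subject, draft_summary(body)))

queue = ApprovalQueue()
on_email("[Support] printer jam", "The printer in room 4 is jammed.\nPlease help.", queue)
ticket = queue.approve(0)
```

The point of the queue is that the model never creates a ticket on its own; it only stages drafts for a one-click human sign-off.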
> Yeah no it didn’t
What is "it" and what didn't it do?
With OpenClaw and powerful local models like Kimi 2.5, these specs make a lot of sense.
K2.5 isn't remotely a local model