Comment by satvikpendem
1 month ago
The latest Kimi model is comparable in performance at least for these sorts of use cases, but yes it is harder to use locally.
> harder to use locally
Which means most people must be using OpenClaw connected to Claude or ChatGPT.