mohsen1 12 hours ago:
At this point, if you're paying out of pocket, you should use Kimi or GLM for it to make sense.

andai 8 hours ago:
GLM is OK (I haven't used it heavily, but it seems alright so far). It's a bit slow with ZAI's coding plan and amazingly fast on Cerebras, but their coding plan is sold out. Haven't tried Kimi; I hear good things.

bluerooibos 10 hours ago:
These are super slow to run locally, though, unless you've got some great hardware, right? At least, my M1 Pro seems to struggle and take forever running them via Ollama.

corysama 4 hours ago:
Try this: https://unsloth.ai/docs/models/qwen3-coder-next