GoatOfAplomb 7 hours ago
I wonder if my $20/mo subscription will last 10 minutes.

mohsen1 6 hours ago
At this point, if you're paying out of pocket, you should use Kimi or GLM for it to make sense.

andai 2 hours ago
GLM is OK (I haven't used it heavily, but it seems alright so far). It's a bit slow with ZAI's coding plan, and amazingly fast on Cerebras, but their coding plan is sold out. Haven't tried Kimi; I hear good things.

bluerooibos 4 hours ago
These are super slow to run locally, though, unless you've got some great hardware, right? At least, my M1 Pro seems to struggle and take forever using them via Ollama.

tclancy 6 hours ago
Ah, OK, same. I keep wondering how this would ever accomplish anything.

simlevesque 6 hours ago
I've had good results with Haiku for certain tasks.