mohsen1 3 hours ago
At this point, if you're paying out of pocket, you should use Kimi or GLM for it to make sense.

bluerooibos 1 hour ago (reply)
These are super slow to run locally, though, unless you've got some great hardware - right?
At least, my M1 Pro seems to struggle and take forever using them via Ollama.
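A rough back-of-the-envelope suggests why these models crawl on laptop hardware: even at aggressive 4-bit quantization, the weights alone dwarf an M1 Pro's unified memory. The parameter counts below are approximate figures from the models' public releases (Kimi K2 ~1T total, GLM-4.5 ~355B total); check the model cards for exact numbers.

```python
def weight_size_gb(params_billions: float, bits_per_weight: int) -> float:
    """Rough size of the model weights alone, in GB.

    1e9 params * (bits/8) bytes each; ignores KV cache and activations,
    which add further memory on top of this.
    """
    return params_billions * bits_per_weight / 8

# Approximate total parameter counts (assumptions, verify against model cards):
for name, params in [("Kimi K2", 1000), ("GLM-4.5", 355)]:
    gb = weight_size_gb(params, 4)  # 4-bit quantization
    print(f"{name}: ~{gb:.0f} GB of weights at 4-bit")
```

Both are mixture-of-experts models, so only a fraction of the parameters are active per token, but the full weight set still has to be resident (or paged from disk). An M1 Pro tops out at 32 GB of unified memory, so Ollama ends up streaming weights from SSD, which is why generation takes forever.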