Comment by gazarsgo
6 days ago
I dunno, I ran `ollama run gpt-oss:20b` locally and it only used 16 GB of memory, and inference was decent enough on my MacBook.
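For reference, a minimal way to reproduce that memory check, assuming Ollama is installed and the `gpt-oss:20b` tag is available from the registry:

```
# Pull and load the 20B model, then open an interactive prompt
ollama run gpt-oss:20b

# In another terminal: list loaded models with their memory footprint
ollama ps
```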
Now do the 120b model.