Comment by lelanthran
1 day ago
> Spend $2k+ upfront to build a machine to run models locally.
You said this was in the context of a gaming rig. You're not spending an extra $2k on your gaming rig to run models locally.
If you're building a dedicated LLM machine, or you're using less compute than you're paying the provider for, then, yup - $20/m is cheaper.
When you start using the model more, or if you're already building a gaming rig, then it's going to be cheaper to self-host.
I think the plot flew way over your head. We were comparing costs, and on my reading, "gaming rig" is more about consumer-grade hardware than an assumption that you already have one. After all, I assume we'd be buying a 5090 for the VRAM, and at current market price that's $3k alone. You'd probably also end up spending at least $20 in electricity and cooling every month if you're running it near 24/7.
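A quick back-of-envelope sketch of the break-even math using the numbers thrown around in this thread ($3k for the GPU, ~$20/m in electricity, $20/m for the subscription - all the thread's assumptions, not measured costs):

```python
def months_to_break_even(upfront, monthly_self_host, monthly_subscription):
    """Months until self-hosting is cheaper than subscribing.

    Returns None if the monthly self-host cost alone meets or exceeds
    the subscription price, i.e. you never break even.
    """
    saving_per_month = monthly_subscription - monthly_self_host
    if saving_per_month <= 0:
        return None
    return upfront / saving_per_month

# Thread's numbers: electricity eats the whole $20/m, so no break-even.
print(months_to_break_even(3000, 20, 20))  # None
# Lighter usage (say $5/m in power): still ~16.7 years to break even.
print(months_to_break_even(3000, 5, 20))   # 200.0
```

Which is the point: at these assumed prices, the hardware either never pays for itself or takes well over a decade to do so.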
So again, the economics don’t really make sense except in specific edge cases or for folks that don’t want to pay vendors. Also please don’t use italics, I don’t know why but every time you see them used it’s always a silly comment.