Comment by lelanthran
1 day ago
> Sure, but we were talking about gaming rigs to run models locally. You are describing an extreme edge case: folks who keep 24/7 workloads running on gaming rigs in their homes.
In that scenario the case for the rented-hardware model is even weaker: if you're going to have a gaming rig anyway, you're only paying a little more on top for a GPU with more VRAM, not the full cost of the rig.
The comparison then is the extra cost of using a 24GB GPU over a standard gaming rig GPU (12GB? 8GB?) versus the cost of renting the GPU whenever you need it.
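To sketch that out in Python (every number here is an assumption, not a quote from any vendor): if the bigger-VRAM card carries a fixed premium and renting a comparable GPU is billed hourly, the break-even point is just the premium divided by the hourly rate.

    # Back-of-envelope: GPU premium vs. renting on demand (all figures are assumptions)
    gpu_premium_usd = 800.0      # assumed extra cost of a 24GB card over the 8-12GB card you'd buy anyway
    rental_usd_per_hour = 0.50   # assumed hourly rate for renting a comparable 24GB GPU
    break_even_hours = gpu_premium_usd / rental_usd_per_hour
    print(f"Premium pays for itself after ~{break_even_hours:.0f} GPU-hours")  # ~1600 hours at these rates

Below that many hours of real use, renting wins; above it, the bigger card does.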
Honestly, I'm not sure what you're talking about.
I could either spend $20 a month on my Cursor license.
Or
Spend $2k+ upfront to build a machine to run models locally, plus pay for the electricity and the time to set up both the machine and the software.
> Spend $2k+ upfront to build a machine to run models locally.
You said this was in the context of a gaming rig. You're not spending an extra $2k on your gaming rig to run models locally.
If you're building a dedicated LLM machine OR you're using less compute than you're paying the provider for, then, yup, $20/month is cheaper.
When you start using the model more, or if you're already building a gaming rig, then it's going to be cheaper to self-host.
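To put rough numbers on that break-even (all of these figures are assumptions, not anyone's actual bill): if your usage would cost more at the provider than the extra electricity costs at home, the hardware premium eventually pays for itself.

    # Back-of-envelope: self-hosting on a rig you'd build anyway vs. paying a provider (assumed figures)
    extra_hardware_usd = 1000.0        # assumed GPU premium for the bigger-VRAM card
    electricity_usd_per_month = 20.0   # assumed extra power/cooling for heavy use
    provider_usd_per_month = 100.0     # assumed monthly provider bill for the same heavy usage
    monthly_saving = provider_usd_per_month - electricity_usd_per_month
    print(f"Premium pays for itself after ~{extra_hardware_usd / monthly_saving:.1f} months")  # ~12.5 months

If you're only ever using $20/month worth of compute, that saving goes to zero and the subscription stays cheaper.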
I think the point flew way over your head. We were comparing costs, and to my reading, "gaming rig" is more about consumer-grade hardware than an assumption that you already own one. After all, I assume we would be buying a 5090 for the VRAM, and at current market prices that's $3k alone. You would probably also end up spending at least $20 a month on electricity and cooling if you are running it near 24/7.
So again, the economics don't really make sense except in specific edge cases or for folks who simply don't want to pay vendors. Also, please don't use italics; I don't know why, but every time I see them used it's a silly comment.