
Comment by cbsmith

4 days ago

So the idea is that it SHOULD cost OpenAI a trillion dollars to do what you can accomplish with a potato?

No, and I'm not even sure how you arrived at that conclusion. The idea is that there are models out there that can run on small amounts of VRAM. If all it costs is charging your phone, as opposed to a subscription to some overvalued AI company, people will choose ‘free’ first. We have models that can google things now, so when online they only need a limited amount of built-in knowledge, and offline only a specific subset.

  • I think there are lots of advantages to running a model locally. Saving money is one of them, but only if you can keep the thing busy. You wisely put "free" in quotes for a reason: you paid money for the hardware the model is running on, and you're paying the electricity bill to power it too. Even if you pay a 100% markup to the cloud, unless you're keeping it busy 50% of the time, it's cheaper to rent, because your fixed hardware cost amortizes over fewer hours of actual use (a quick sketch of the arithmetic is below).
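
To make that break-even concrete, here's a minimal sketch of the arithmetic. It assumes the local box costs a fixed hourly rate (amortized hardware plus power, treated as fixed whether or not it's busy) and the cloud charges a 100% markup over that rate at full utilization; the dollar figures are illustrative, not real prices:

```python
# Break-even arithmetic for local vs. cloud inference.
# All rates are illustrative assumptions, not real prices.

local_full_util_rate = 0.15  # USD/hour if the box runs 100% of the time
                             # (amortized hardware + power, assumed fixed)
cloud_rate = 2 * local_full_util_rate  # cloud at a 100% markup

for utilization in (0.10, 0.25, 0.50, 0.75, 1.00):
    # Fixed local costs amortize only over the hours you actually use:
    local_per_used_hour = local_full_util_rate / utilization
    winner = "local" if local_per_used_hour < cloud_rate else "cloud"
    print(f"{utilization:4.0%} busy: local ${local_per_used_hour:.2f}/h "
          f"vs cloud ${cloud_rate:.2f}/h -> {winner} wins")
```

At 50% utilization the two rates tie (local $0.30/h vs. cloud $0.30/h); below that the cloud wins, above it local wins. In practice electricity is partly marginal rather than fixed, which nudges the break-even point slightly in local's favor, but the shape of the argument holds.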