Comment by xandrius
15 hours ago
You're saying it as if privacy were worthless? Also, not many people would buy a MacBook strictly to run a local model.
Instead, if you wanted a MacBook anyway, you get to run local models for free on top. Very different story.
The privacy angle is not that interesting to me:
- You can find inference providers with whatever privacy terms you're looking for
- If you're using LLMs with real data (say, handling Gmail), then Google has your data anyway, so you might as well use the Gemini API (quick sketch after this list)
- Even if you're a hardcore roll-your-own-mail-server type, you probably still use a hosted search engine and have gotten comfortable with their privacy terms
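To be concrete, this is roughly what I mean; a minimal sketch assuming the google-genai Python SDK and an API key. The model name and the email text are just placeholders:

```python
# Minimal sketch: summarize an email with the hosted Gemini API.
# Assumes `pip install google-genai`; the key, model name, and email are placeholders.
from google import genai

client = genai.Client(api_key="YOUR_API_KEY")

email_body = "Hi, just confirming our meeting moved to Thursday at 3pm..."

response = client.models.generate_content(
    model="gemini-2.0-flash",  # any hosted Gemini model works here
    contents=f"Summarize this email in one sentence:\n\n{email_body}",
)
print(response.text)
```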
Also, on cost: the point is that you can use an API that's many times smarter and faster for a rounding error compared to the price of your Mac (rough numbers below). So why bother with local, except for the cool factor?
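Since the break-even math is the whole argument, here's a back-of-envelope sketch. Every figure in it is an assumption I'm making up for illustration; plug in your own hardware price, token pricing, and usage:

```python
# Back-of-envelope: how long would the MacBook's price buy you in API tokens?
# All numbers are assumed/illustrative; substitute your own pricing and usage.
macbook_cost = 2400.00        # assumed: M-series MacBook with enough RAM for local models
api_cost_per_mtok = 0.40      # assumed: blended $ per 1M tokens for a cheap hosted model
tokens_per_day = 200_000      # assumed: fairly heavy daily usage

daily_api_cost = tokens_per_day / 1_000_000 * api_cost_per_mtok
days_to_break_even = macbook_cost / daily_api_cost
print(f"${daily_api_cost:.2f}/day -> {days_to_break_even:,.0f} days "
      f"(~{days_to_break_even / 365:.0f} years) to spend the laptop's price on API calls")
```

With those (made-up) numbers it comes out to pennies a day and decades to break even, which is what I mean by "rounding error".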