Comment by gguncth

2 days ago

I have no desire to run an LLM on my laptop when I can run one on a computer the size of six football fields.

The point is that when you run it on your own hardware, you can feed the model your health data, bank statements, and private journals and be 5000% sure they're not going anywhere.

  • Regular people don't understand or care about any of that. They'll happily take the Faustian bargain.

    • It only needs one highly public breach and there's going to be a full-on business for someone selling a local-only AI processor for homes.

      Combine it with a media player like an Apple TV or Nvidia Shield and people might buy it.

I've been playing around with my own home-built AI server for a couple of months now. It is so much better than using a cloud provider. It is the difference between drag racing in your own car and renting one from a dealership. You are going to learn far more doing things yourself, your tools will be much more consistent, and you will walk away with a far greater understanding of every process.

A basic last-generation PC with something like a 3060 (12 GB) is more than enough to get started. My current rig pulls less than 500 W with two cards (3060+5060). And, given the current temperature outside, the rig helps heat my home. So I am not contributing to global warming, water consumption, or any other datacenter-related environmental evil.
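For anyone curious what "using it" looks like in practice: a rig like this typically runs an inference server that exposes a local HTTP API, and everything stays on localhost. A minimal sketch, assuming an Ollama-style server on its default port 11434 and a hypothetical model name `llama3` (swap in whatever you've pulled):

```python
import json

# Assumed default endpoint for a locally hosted Ollama server.
# The request never leaves the machine - that's the whole point.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt, model="llama3"):
    """Build the JSON body for a single non-streaming completion."""
    return {
        "model": model,    # whichever model you've pulled locally
        "prompt": prompt,
        "stream": False,   # return one complete response, not chunks
    }

payload = build_request("Summarize my journal entry from Tuesday.")
body = json.dumps(payload)

# To actually send it (requires the local server to be running):
#   import urllib.request
#   req = urllib.request.Request(
#       OLLAMA_URL, body.encode(),
#       {"Content-Type": "application/json"})
#   print(json.loads(urllib.request.urlopen(req).read())["response"])
```

The send is left commented out since it needs a running server, but the payload shape matches Ollama's generate API. Feeding it health data or journals is exactly the use case where local-only matters.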