
Comment by eadmund

2 days ago

> I'd assume that a modern CPU would do the same amount of work with a fraction of energy so that it does not even make economical sense to run such outdated hardware.

There are 8,760 hours in a non-leap year, and electricity in the U.S. averages 12.53 cents per kilowatt-hour[1]. A really power-hungry CPU running full-bore at 500 W for a year would thus use about $550 of electricity. Even if power consumption dropped by half, the roughly $275 a year saved is only about 10% of the cost of a new computer, so the payoff date of an upgrade is ten years in the future (ignoring the cost of performing the upgrade itself, which is non-negligible, as is the risk).
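
A back-of-the-envelope sketch of that arithmetic, in Python. The 500 W draw and 12.53 ¢/kWh rate come from the paragraph above; the ~$2,750 replacement-machine price is an assumption I've added to make the "about 10%" figure concrete, not something stated in the comment:

```python
# Back-of-the-envelope check of the electricity-vs-upgrade numbers above.
# Assumed: 500 W continuous draw, 12.53 cents/kWh, and an illustrative
# ~$2,750 replacement machine (not stated above; chosen so the ~10% holds).

HOURS_PER_YEAR = 8_760        # non-leap year
RATE_USD_PER_KWH = 0.1253     # U.S. average
POWER_W = 500                 # power-hungry CPU running full-bore

annual_kwh = POWER_W * HOURS_PER_YEAR / 1000         # ~4,380 kWh
annual_cost = annual_kwh * RATE_USD_PER_KWH          # ~$549

savings_if_halved = annual_cost / 2                  # ~$275 per year
new_machine_cost = 2_750                             # assumed, see note above
payoff_years = new_machine_cost / savings_if_halved  # ~10 years

print(f"annual electricity:     ${annual_cost:,.0f}")
print(f"savings at half power:  ${savings_if_halved:,.0f}/yr")
print(f"payoff period:          {payoff_years:.1f} years")
```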

And of course buying a new computer is a capital expense, while paying for electricity is an operating expense.

1: https://www.eia.gov/electricity/monthly/epm_table_grapher.ph...

You can buy a mini PC for less than $550. For $200 on Amazon you can get an N97-based box with 12 GB RAM, 4 cores running at 3 GHz, and a 500 GB SATA SSD. That’s got to be as fast as their current build systems, and it supports the required instructions.

  • Those single-memory-channel shitboxes aren't even fast enough to stay usable during big Windows updates, let alone in production.

    • One channel of DDR5-4800 actually competes pretty well against four channels of DDR3-1333 spread across two chiplets, which was the best Opteron configuration old enough to not have SSE4.1 (rough peak-bandwidth numbers in the sketch after this thread).

  • If you don't understand bandwidth, and how long components can run at the 80th percentile before failure, you're out of your element in this discussion.
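
For what it's worth, a quick sketch of the peak-bandwidth arithmetic behind the DDR5-vs-DDR3 comparison above. These are theoretical peaks only (transfer rate times an 8-byte channel times channel count); real throughput and latency will differ, and the Opteron channel count is the dual-die, two-channels-per-die layout mentioned in the thread:

```python
# Rough theoretical peak-bandwidth comparison for the DDR5 vs. DDR3 claim.
# Peak = transfer rate (MT/s) * 8 bytes per 64-bit channel * channel count.
# Real-world throughput is lower, and latency matters too.

def peak_gbs(mt_per_s: int, channels: int, bytes_per_transfer: int = 8) -> float:
    """Theoretical peak bandwidth in GB/s (decimal gigabytes)."""
    return mt_per_s * bytes_per_transfer * channels / 1000

ddr5_one_channel = peak_gbs(4800, channels=1)    # N97-class mini PC
ddr3_four_channels = peak_gbs(1333, channels=4)  # dual-die Opteron, 2 channels/die

print(f"1x DDR5-4800 : {ddr5_one_channel:.1f} GB/s")    # ~38.4 GB/s
print(f"4x DDR3-1333 : {ddr3_four_channels:.1f} GB/s")  # ~42.7 GB/s
```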