
Comment by russdill

3 days ago

If your computer is still doing bursty jobs during that period, it will use less power but just as much energy. Sure, you can reduce the power, but if you aren't also reducing what you ask it to do, it'll just draw that maximum allowed power for a longer period of time.

All modern CPUs will boost to high clock speeds and voltages to get work done quicker, but at considerably higher power draw per operation. On that side of the equation, it's clear that boosting uses more energy. The problem is that the entire CPU package is powered on for longer if you don't boost, and that costs energy too, so it's a trade-off between the two. The usual assumption is that there isn't much difference between the two approaches, but having seen the insanity of 13th- and 14th-gen Intel parts consuming 250W when 120W gets about 95% of the performance, I think it's very likely that moving down to power save and avoiding that level of boosting saves at least a small amount of power.
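Plugging the figures above into energy = power × time makes this concrete. A quick sketch (the 250 W / 120 W / 95% numbers are from the anecdote above; the 100 s job length is invented for illustration):

```python
# Energy = power x time for a fixed job. The 250 W vs 120 W-at-95%-performance
# figures come from the comment above; the 100 s job length is invented.

def energy_joules(power_watts, seconds):
    return power_watts * seconds

job_seconds_boosted = 100.0                        # time to finish at full boost
boosted = energy_joules(250, job_seconds_boosted)

# Power-limited run: ~95% of the performance, so ~5% longer wall-clock time.
limited = energy_joules(120, job_seconds_boosted / 0.95)

print(f"boosted: {boosted / 1000:.1f} kJ, power-limited: {limited / 1000:.1f} kJ")
# -> boosted: 25.0 kJ, power-limited: 12.6 kJ
```

Even after paying the ~5% time penalty, the power-limited run uses roughly half the energy per job, which is the point above.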

  • This is some pretty old analysis, but I remember when smartphones came out and people were thinking about throttling their applications to lower power consumption, the general advice was just to "race to idle".

    The consensus was that spending more time in low-power states (where you draw ~0W) was much more efficient than spending a longer time at the CPU's sweet spot with all sorts of peripherals online that you didn't need anyway.

    I remember when Google made a big deal out of "bundling" idle CPU and network requests, since bursting them out was more efficient than having the radio and CPU trotting along at low bandwidth.

    • However there are two factors that might make "race to idle" more valid on phones than on most other platforms:

      Smartphone chips are designed to much stricter thermal and power limits. There is only so much power you can get out of the tiny battery, and only so much heat you can get rid of without making the phone uncomfortably hot. Even in a burst that puts a limit on the wastefulness. Desktop CPUs are very different: If you can get 10% more performance while doubling power draw, people will just buy bigger coolers and bigger power supplies to go along with the CPU. Notebook CPUs are somewhere in the middle: limited, but with much better cooling and much more powerful batteries than phones.

      The other thing is the operating system: "race to idle" makes sense in phones because the OS will actually put the CPU into sleep states if there's nothing to do, and puts active effort into not waking the CPU up unnecessarily and cramming work into the time slots when the CPU is active anyways. Desktop operating systems just don't do that to the same degree. You might race to idle, but the rest of the system will then just waste power with background work once it's idle.

      2 replies →

    • Race to idle probably makes more sense in the context of smartphones where there’s at least some chance that “idle” means the screen might be turned off.

      For a desktop, the usage… I mean, it is sort of different really. If I'm writing a TeX file, for example, slower compiles mean I'll get… fewer previews. The screen is still on. More previews is vaguely useful, but probably doesn't substantially speed up the rate at which I write—the main bottleneck is somewhere between my hat and my hands, I think.
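A toy model makes the race-to-idle trade-off discussed above concrete: whether racing wins depends on how much the rest of the platform draws while the CPU is awake, and on how deep the idle state actually is. All figures below are hypothetical, chosen only to illustrate the two regimes:

```python
# Toy race-to-idle model (all figures hypothetical). While work runs, the
# platform (RAM, radios, peripherals) draws power on top of the CPU; once
# idle, the whole system drops to some idle floor.

def window_energy(cpu_w, platform_w, busy_s, idle_floor_w, window_s):
    """Total energy over a fixed wall-clock window: active phase + idle remainder."""
    active = (cpu_w + platform_w) * busy_s
    idle = idle_floor_w * (window_s - busy_s)
    return active + idle

WINDOW = 60.0     # one-minute window, in seconds
PLATFORM = 20.0   # draw of everything besides the CPU while the system is awake

# Same job either way: race at 30 W for 5 s, or sweet-spot clocks at 10 W for 12 s.
for idle_floor, label in [(0.5, "phone-like deep idle"),
                          (20.0, "desktop-like shallow idle")]:
    race = window_energy(30, PLATFORM, 5, idle_floor, WINDOW)
    slow = window_energy(10, PLATFORM, 12, idle_floor, WINDOW)
    print(f"{label}: race={race:.0f} J, sweet-spot={slow:.0f} J")
```

With a near-zero idle floor (the phone case), racing to idle wins because the platform overhead is paid for fewer seconds; with a high idle floor (a desktop that never really sleeps), the advantage largely evaporates.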

It is well known in the PC hardware enthusiast community that the last few percent of performance come at enormous increases in power consumption, as voltages are raised to prevent errors at higher clock speeds.

Manufacturers chase benchmark results from YouTubers and magazines. Even a few percent difference in framerate can be the difference between everyone telling each other to buy one motherboard, processor, or graphics card over another.

Amusingly, you often get better performance by undervolting and lowering the processor's power limits. This keeps temperatures low, so you don't end up with the PC equivalent of the "Toyota Supra horsepower chart" meme.

1400W for a desktop PC is...crazy. That's a threadripper processor plus a bleeding edge top of the line GPU, assuming that's not just them reading off the max power draw on the nameplate of the PSU.

If their PC is actually using that much power, they could save far more money, CO2, etc by undervolting both the CPU and GPU.

  • 1400 is definitely the sticker on the side of the PSU. There is some theory behind keeping a PSU at 30–50% load for optimal efficiency, but considering the cost of these 1kW+ units, you're probably better off right-sizing it.

  • I massively overspec the PSUs in my builds because I want to keep them in the optimal efficiency range rather than pushing their limits. For a typical 800W budget I usually go with a tier-1 1200W offering.

  • I'm actually using a 1600W PSU; 1400W is my target max draw. This is a dual-EPYC (64 cores per CPU) system, btw. The max draw from the CPUs + motherboard + drives running at a peak 3700MHz, without the GPUs, is 495W! Adding 4x 4090s (underclocked) quickly gets you to 1400W+.
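On the efficiency-vs-right-sizing question, a sketch with a hypothetical efficiency curve (shaped loosely like published 80 Plus Gold load points, not taken from any real unit) shows how little the load point usually matters in watts:

```python
# Wall draw = DC load / efficiency, and efficiency depends on load fraction.
# The curve below is hypothetical, loosely shaped like 80 Plus Gold points.

def efficiency(load_fraction):
    """Hypothetical efficiency curve: poor at light load, peaking mid-load."""
    points = [(0.10, 0.85), (0.20, 0.90), (0.50, 0.92), (1.00, 0.89)]
    # Simple linear interpolation between the published-style points.
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x0 <= load_fraction <= x1:
            return y0 + (y1 - y0) * (load_fraction - x0) / (x1 - x0)
    return points[0][1] if load_fraction < points[0][0] else points[-1][1]

def wall_watts(dc_load, psu_rating):
    return dc_load / efficiency(dc_load / psu_rating)

# An 800 W DC load on a 1200 W unit (~67% load) vs a 1600 W unit (50% load):
for rating in (1200, 1600):
    print(f"{rating} W PSU: {wall_watts(800, rating):.0f} W at the wall")
```

Under this curve the bigger unit saves on the order of 10 W at the wall, which is why right-sizing for purchase cost usually comes out ahead of chasing the efficiency sweet spot.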

As with everything, it depends. If you are going to do the same jobs regardless of the amount of time it takes, then yeah, dropping the max power probably just spreads the energy use over time. That doesn't usually help you save money, unless you have a very interesting residential plan.

OTOH, if it's something like realtime game rendering without a frame limiter, throttling would reduce the frame rate, reducing the total amount of work done, and most likely the total energy expended.
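A sketch of the frame-limiter case, with hypothetical figures: capping the frame rate halves the number of frames rendered, so the rendering-work term drops even though the GPU now spends part of each second at its (nonzero) idle draw:

```python
# Toy model of frame limiting (all figures hypothetical). Uncapped, the GPU
# renders as many frames as it can at full power; capped, it renders fewer
# frames and sits at a lower idle draw between them.

def session_energy_wh(fps, frame_joules, idle_watts, max_fps, seconds):
    """Session energy: per-frame rendering work plus idle draw between frames."""
    busy_fraction = fps / max_fps            # share of time spent rendering
    work = fps * frame_joules * seconds      # energy doing actual rendering
    idle = idle_watts * (1 - busy_fraction) * seconds
    return (work + idle) / 3600.0            # joules -> watt-hours

MAX_FPS = 240       # what the GPU manages uncapped
FRAME_J = 1.25      # ~300 W / 240 fps worth of work per frame
IDLE_W = 30         # draw while waiting for the next frame
HOUR = 3600

uncapped = session_energy_wh(240, FRAME_J, IDLE_W, MAX_FPS, HOUR)
capped = session_energy_wh(120, FRAME_J, IDLE_W, MAX_FPS, HOUR)
print(f"uncapped: {uncapped:.0f} Wh/h, capped at 120 fps: {capped:.0f} Wh/h")
```

Halving the frame rate roughly halves the work done, so, unlike the fixed-job case above, throttling here cuts total energy, not just peak power.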