
Comment by tsimionescu

1 year ago

Of course, it's not gonna be common. But it will occasionally happen, in such a large data center.

Then you need to account for their low share of idle VMs when measuring how much electricity they're responsible for. If it's only the case for 1% of idle VMs, then you should count only 1% of a host's electric power per idle VM (plus the small fraction of a host's CPU power that an idle VM consumes). In any case, it's going to be very small (~$20/year)[1], and "it costs them nothing" is a good approximation of that, or at least a much better one than assuming that the price they charge you reflects an expense on their side (which is the point rafram argued at the very start of this discussion).

[1]: let's say 10 W, which at $0.2 per kWh[2] works out to 87.6 kWh, costing about $17.5 for an entire year!

[2]: electricity prices from [here](https://www.eia.gov/electricity/monthly/epm_table_grapher.ph...); $0.2 per kWh is slightly above the rates in California and Rhode Island, which are the highest in the US for industrial use.
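The footnote arithmetic can be sketched out like this (a quick check using the comment's assumed figures of 10 W per idle VM and $0.2/kWh; both numbers are the comment's rough estimates, not measured values):

```python
# Back-of-the-envelope annual electricity cost of one idle VM,
# using the figures assumed in the comment above.
watts = 10              # assumed power draw attributable to one idle VM
price_per_kwh = 0.20    # assumed industrial rate, $/kWh
hours_per_year = 24 * 365  # 8760 h

annual_kwh = watts / 1000 * hours_per_year   # 87.6 kWh
annual_cost = annual_kwh * price_per_kwh     # dollars per year

print(f"${annual_cost:.2f}/year")  # prints "$17.52/year"
```

So even at the highest US industrial rates, the figure rounds to roughly $17.5 a year, consistent with the "~$20/year" estimate above.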