Comment by tsimionescu
1 year ago
Say you have 3 hosts, each with a capacity of 10 VMs. At some point you have 28 running VMs - 10 on host1, 10 on host2, 8 on host3. Someone then closes down 2 of the VMs on host1, and 7 of the VMs on host3.
Now you have 19 VMs running, but need to keep all 3 hosts powered. If you don't have live VM migration, you are now forced to keep host3 running only because 1 VM is running on it, even if that VM is idle. So, this one idle VM is responsible for all the energy consumption of host3, and will continue to be so until at least 3 more VMs get started (since you have room for 2 more VMs on host1).
If you did have live VM migration, or if the idle VM were powered down instead of running idle, you could shut down host3 completely, moving the VM to host1, and only power host3 back up if it's needed for new VMs.
This is equivalent to the problem of memory fragmentation. Even though overall usage is low, if host usage is highly fragmented and you aren't allowed to move used memory around (compacting), you can end up consuming far more than actually needed.
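The scenario above can be sketched as a tiny bin-packing simulation. This is just an illustrative toy (the host list, capacity, and first-fit repacking strategy are my assumptions, not anything from a real scheduler): without migration, any host with even one VM must stay powered; with migration, VMs can be consolidated to free whole hosts.

```python
# Toy model of the scenario: 3 hosts, capacity 10 VMs each.
# hosts[i] = number of VMs currently on host i.
CAPACITY = 10

def powered_hosts(hosts):
    """Without live migration, every host with at least one VM stays on."""
    return sum(1 for vms in hosts if vms > 0)

def repack(hosts):
    """First-fit consolidation, as if live migration were free:
    pour all VMs into as few hosts as possible."""
    total = sum(hosts)
    packed = []
    while total > 0:
        fill = min(total, CAPACITY)
        packed.append(fill)
        total -= fill
    return packed + [0] * (len(hosts) - len(packed))

# After shutting down 2 VMs on host1 and 7 on host3: 8, 10, 1.
hosts = [8, 10, 1]
print(powered_hosts(hosts))          # no migration: 3 hosts must stay on
print(powered_hosts(repack(hosts)))  # with migration: 2 hosts suffice
```

The 19 remaining VMs fit on 2 hosts, but the single stranded VM on host3 keeps a third one burning power until it can be moved.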
Except you don't have 3 hosts but 3 thousand, and while you're stopping those 9 VMs, somebody else is starting 5 or 15 new ones!
Yes, it is similar to memory fragmentation in some ways, but your argument is like saying an integer stored on the heap costs a full memory page! You realize that's nonsense. Sure, in extreme edge cases it can, but that's not a good metric for the memory footprint of an integer!
Being able to move VMs is nice as it allows better host utilization, but it doesn't mean hosts often end up with a single idle VM!
Of course, it's not gonna be common. But it will occasionally happen, in such a large data center.
Then you need to account for how rarely this happens when measuring how much electricity an idle VM is responsible for. If it's only the case for 1% of idle VMs, then you should count only 1% of a host's electric power per idle VM (plus the small fraction of host CPU power that an idle VM consumes). In any case, it's going to be very small (~$20/year)[1], and "it costs them nothing" is a good approximation of that, or at least a much better one than assuming that the price they charge you reflects an expense on their side (which is the point that was argued by rafram at the very start of this discussion).
[1]: let's say 10 W, which at $.2 per kWh[2] ends up costing $17.52 for an entire year.
[2]: electricity prices from [here](https://www.eia.gov/electricity/monthly/epm_table_grapher.ph...). $.2 per kWh is slightly above the rates in California and Rhode Island, which are the highest in the US for industrial use.
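The amortization argument and the footnote arithmetic are easy to check numerically. Note the host wattage, stranded fraction, and idle-VM draw below are illustrative assumptions of mine, not figures from the comment; only the 10 W and $0.2/kWh come from footnotes [1] and [2]:

```python
# Amortization per the comment: if only 1% of idle VMs end up alone
# keeping a host powered, attribute 1% of a host's power to each idle
# VM, plus the VM's own small marginal draw. Wattages here are assumed.
host_full_watts = 500      # assumed: power draw of one running host
stranded_fraction = 0.01   # assumed: share of idle VMs stranding a host
idle_vm_cpu_watts = 2      # assumed: marginal CPU draw of one idle VM

per_idle_vm_watts = stranded_fraction * host_full_watts + idle_vm_cpu_watts
print(per_idle_vm_watts)  # 7.0 W, same order of magnitude as footnote [1]

# Footnote [1]: ~10 W continuous draw at $0.20/kWh for a full year.
hours_per_year = 24 * 365                         # 8760 h
annual_cost = 10 / 1000 * hours_per_year * 0.20   # kWh * price
print(round(annual_cost, 2))  # 17.52
```

So even with generous assumptions, the attributed cost per idle VM lands around tens of dollars a year, which is why "it costs them nothing" is a reasonable approximation.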