Comment by dpe82
9 hours ago
Power is not the most expensive part of a data center's lifetime cost, especially these days when you're filling them with several billion dollars of Nvidia chips. It's still an important consideration, of course, but not the only one.
I don't know if that's really true. Given realistic equipment life cycles (~10 years, not 3 as commonly believed), operating power is going to be 75-80% of the TCO, or more.
I don't see how that number could possibly be realistic.
An H100 costs ~$30k new and draws 500W of power.
500W for a year is about 4,400 kWh, which at $0.10/kWh is about $440/year, and that's at full utilization (unrealistic).
TCO of an AI data center should be entirely dominated by capex depreciation.
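Rough numbers in Python, taking the figures above at face value (full utilization, GPU-only capex; the lifetimes are the 3- and 10-year cases from upthread):

```python
# Back-of-envelope TCO split for a single H100, using the numbers above.
# Full utilization and GPU-only capex are the comment's assumptions.

CAPEX_USD = 30_000       # H100 purchase price, per the comment above
POWER_KW = 0.5           # 500 W draw
PRICE_PER_KWH = 0.10     # $/kWh
HOURS_PER_YEAR = 8_760

for lifetime_years in (3, 10):
    power_cost = POWER_KW * HOURS_PER_YEAR * lifetime_years * PRICE_PER_KWH
    total = CAPEX_USD + power_cost
    print(f"{lifetime_years:>2} yr: power ${power_cost:,.0f} "
          f"= {power_cost / total:.0%} of ${total:,.0f} TCO")
```

Even at 10 years and 100% utilization, power comes out to roughly 13% of the GPU-only total under these assumptions, nowhere near 75-80%.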
In fairness, your calculation looks at the most expensive element of the DC but ignores all the associated parts required to utilize the H100: CPU, memory, cooling, etc. Not to say that flips the calculation (I don't have the answer), but it does leave a lot of power out.
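For illustration, here is the same sketch with a hypothetical facility overhead folded in; the PUE, host power, and non-GPU capex figures are made-up placeholders, not numbers from the thread:

```python
# Same back-of-envelope as above, with assumed (illustrative, not sourced)
# host hardware and cooling overheads folded in per GPU.

GPU_CAPEX_USD = 30_000    # from the thread
HOST_CAPEX_USD = 10_000   # assumed CPU/memory/chassis/network share per GPU
GPU_KW = 0.5              # 500 W, from the thread
HOST_KW = 0.3             # assumed CPU/memory/NIC draw per GPU
PUE = 1.4                 # assumed facility overhead (cooling, power losses)
PRICE_PER_KWH = 0.10
HOURS_PER_YEAR = 8_760

facility_kw = (GPU_KW + HOST_KW) * PUE
for lifetime_years in (3, 10):
    power_cost = facility_kw * HOURS_PER_YEAR * lifetime_years * PRICE_PER_KWH
    total = GPU_CAPEX_USD + HOST_CAPEX_USD + power_cost
    print(f"{lifetime_years:>2} yr: power {power_cost / total:.0%} of TCO")
```

Under these placeholder numbers the power share roughly doubles (to about 20% over 10 years) but still doesn't approach 75-80%; real figures will obviously vary with the capex and PUE assumptions.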