Comment by jmyeet

7 hours ago

I suspect you've misread that document (it's a good document, though). It says a large parts plant uses ~188,000 MWh, I think per year.

A modern AI data center uses 20-100+ MW of electricity. Those two figures aren't the same thing: one is power, the other is energy. But 20 MW of continuous draw (which AI data centers sustain) works out to about 175,000 MWh of electricity per year. So at the low end a data center matches that plant, and at the high end it might use 5x as much.
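The arithmetic above can be sketched out directly. This is just the comment's own rough figures (188,000 MWh/yr for the plant, 20-100 MW for the data center), not measured values:

```python
# Convert a constant power draw (MW) into annual energy use (MWh).
HOURS_PER_YEAR = 24 * 365  # 8,760

def annual_mwh(continuous_mw: float) -> float:
    """Annual energy (MWh) for a constant power draw (MW)."""
    return continuous_mw * HOURS_PER_YEAR

plant_mwh = 188_000        # the document's large parts plant, per year
dc_low = annual_mwh(20)    # 20 MW data center -> 175,200 MWh/yr
dc_high = annual_mwh(100)  # 100 MW data center -> 876,000 MWh/yr

print(dc_low / plant_mwh)   # low end roughly matches the plant
print(dc_high / plant_mwh)  # high end is ~4.7x the plant
```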

The document only covers energy usage, so we have to guess what "large" means in terms of employment, but 3,000-7,000 seems to be the range. Compare that to the 20-30 people a data center employs.

But AI data centers are worse, because they actually produce what I call negative jobs. Currently their only value proposition is laying people off and otherwise suppressing labor costs. All while residents pay more for their electricity with money they no longer have, because they got laid off.

> A modern AI data center uses 20-100MW+ of electricity.

I understand the high-end builds have exceeded 100 kW per rack at this point, with the largest sites exceeding 1 GW (i.e. 10x your upper bound). So the smallest data centers use as much as the largest auto plants, and the largest data centers use roughly 50x that.
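Extending the same conversion to the GW scale shows where the multiplier comes from; the 1 GW site and 188,000 MWh/yr plant figures are the thread's own estimates:

```python
# Compare a 1 GW (1,000 MW) data center site against the thread's
# large auto parts plant (~188,000 MWh per year).
HOURS_PER_YEAR = 24 * 365  # 8,760

def annual_mwh(continuous_mw: float) -> float:
    """Annual energy (MWh) for a constant power draw (MW)."""
    return continuous_mw * HOURS_PER_YEAR

plant_mwh = 188_000
gw_site_mwh = annual_mwh(1_000)  # 8,760,000 MWh/yr

print(gw_site_mwh / plant_mwh)  # ~47x the auto plant's annual energy
```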