Comment by ggm
11 hours ago
As long as it's unaccounted for by users it's at best an externality. I think it may demand regulation to force this cost to the surface.
Electricity and cooling incur wider costs and consequences.
That's hardly unique to data centers.
I'm all for regulation that makes businesses pay for their externalities - I'd argue that's a key economic role that a government should play.
No disagreement. I think all enterprises that come with community-borne costs (noise, heat, energy use, road use, construction & infrastructure, state incentives) and benefits (tax revenue, jobs) should have some level of accounting. It would be wrong to say the negatives always outweigh the positives; that's not the point here. The point is that a bunch of the costs in DCs relating to power and cooling wind up having an impact on the community at large.
I've been told that in other (non-US) economies, decisions to site hyperscaler DCs have had downstream impacts on power costs and long-term power planning. The infra to make a lot of power appear at a site means the same capex and inputs cannot be used to supply power to towns and villages. There's a social opportunity cost in hosting the DC, because the power supply doesn't magically make more transformers and wires and syncons appear on the market: prices for these things are going up because of a worldwide shortage.
It's like the power version of worldwide RAM pricing.
> As long as it's unaccounted for by users it's at best an externality.
Why is it an externality? Anthropic (or another model provider) pays the electricity cost, which is then passed along in the subscription or API bill. The direct cost of the energy is fully internalized in the price.
Electricity supply is about more than the cost of supplying each kWh. It has a build cost, with assumed inputs which are in turn priced by demand, and those inputs are in short supply worldwide. DC power-construction demand is pushing up the delay and cost of non-DC power projects.
No! What a disaster. Enforce the costs of externalities as close to the source as possible. If electricity costs more money, charge more money for electricity. Don’t add some “regulation” to force end users to pay more based on your estimate of how much more you think electricity should cost.
I said force the cost to the surface, not set a fixed rate of return: I want the build and supply costs at large, as well as the per-kWh charges, to be accounted for.
I mean, if we really cared about this one bit, we'd stop building a car-based society in the US and save far more energy and pollution. That's not politically expedient, and there are powerful vested interests in ensuring it doesn't happen.
That's why I think most of this data center energy use, especially over longer terms, is a joke. Data centers can pretty easily run on solar and wind energy if we spend even a small amount of political capital to make it happen.
They're a lot harder to build out from those sources because they are costed to run 24/7, and the intermittency issue comes to the fore. Unlike things like aluminium smelters, there isn't always a good load-shed or even supply-timing story in a DC, cooling aside (big chunks of cooling can be used for demand management).
I am not in the DC business. If somebody who is says "that's bunkum", I'd pay attention to it.
I don't see how this follows. Data center operators buy energy, and this is almost their only operating expense. Their products are priced to reflect this. The fact that basic AI features are free reflects how little energy they use.
I would be surprised if AI prices reflect their current cost to provide the service, even inference costs. With so much money flowing into AI, the goal isn't to make money, it's to grow faster than the competition.
From this article:
> For the purposes of this post, I’ll use the figures from the 100,000 “maximum”–Claude Sonnet and Opus 4.5 both have context windows of 200,000 tokens, and I run up against them regularly–to generate pessimistic estimates. So, ~390 Wh/MTok input, ~1950 Wh/MTok output.
Expensive commercial energy would be 30¢ per kWh in the US, so the energy cost implied by these figures would be about 12¢/MTok input and 60¢/MTok output. Anthropic's API price for Opus 4.5 is $5/MTok input and $25/MTok output, roughly 40× higher than these implied energy costs.
The direct energy cost of inference is still covered even if you assume that Claude Max/etc plans are offering a tenfold subsidy over the API cost.
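A rough sketch of that arithmetic (assuming the article's pessimistic per-token energy figures, the 30¢/kWh rate, and Anthropic's published Opus 4.5 API prices; the factor-of-ten subscription subsidy is just an illustrative assumption):

```python
# Back-of-the-envelope check of the numbers above (all inputs are assumptions from this thread).
ENERGY_WH_PER_MTOK = {"input": 390, "output": 1950}  # article's pessimistic Wh per million tokens
API_USD_PER_MTOK = {"input": 5.00, "output": 25.00}  # Opus 4.5 list prices
ELECTRICITY_USD_PER_KWH = 0.30                       # deliberately expensive US commercial rate
ASSUMED_SUBSCRIPTION_SUBSIDY = 10                    # hypothetical: plans priced at 1/10 of API rates

for kind, wh in ENERGY_WH_PER_MTOK.items():
    energy_usd = (wh / 1000) * ELECTRICITY_USD_PER_KWH           # implied energy cost per MTok
    effective_price = API_USD_PER_MTOK[kind] / ASSUMED_SUBSCRIPTION_SUBSIDY
    print(f"{kind}: energy ~${energy_usd:.2f}/MTok, API ${API_USD_PER_MTOK[kind]:.2f}/MTok "
          f"(~{API_USD_PER_MTOK[kind] / energy_usd:.0f}x), subsidized ~${effective_price:.2f}/MTok")

# input: energy ~$0.12/MTok, API $5.00/MTok (~43x), subsidized ~$0.50/MTok
# output: energy ~$0.58/MTok, API $25.00/MTok (~43x), subsidized ~$2.50/MTok
```

Even under the assumed tenfold subsidy, the effective price per MTok stays well above the implied energy cost.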
I remain confident that most AI labs are not selling API access for less than it costs to serve the models.
If that's so common, then what's your theory as to why Anthropic isn't price-competitive with GPT-5.2?
> I would be surprised if AI prices reflect their current cost to provide the service, even inference costs.
This has been covered a lot. You can find quotes from one of the companies saying that they'd be profitable if not for training costs. In other words, inference is a net positive.
You have to keep in mind that the average customer doesn't use much inference. Most customers on the $20/month plans never come close to using all of their token allowance.