Comment by foobarian

2 months ago

The only advantage I can come up with is the background temperature being much colder than the Earth's surface. If you ignored the capex to get this launched and running in orbit, could the cooling cost be smaller? Maybe that's the gimmick being used to sell the idea: "Yes, it costs more upfront, but then the 40% cooling bill goes away... breakeven in X years."

Strictly speaking, the thermosphere is actually much warmer than the atmosphere we experience--on the order of hundreds or even a thousand degrees Celsius, if you're measuring by temperature (the average kinetic energy of the molecules). However, particle density is so low that the total heat content of the thermosphere is tiny. And because there are so few particles, conduction and convection are essentially nonexistent, which means cooling has to rely entirely on radiation, which is a much less effective way of shedding heat than the other modes.

In other words, a) background temperature (to the extent it's even meaningful) is much warmer than Earth's surface and b) cooling is much, much more difficult than on Earth.

  • Technically, radiative cooling is 100% efficient, and remarkably effective: you can cool an inert object to the temperature of the CMBR (~2.7 K) without doing anything at all. However, it is rather slow, and it works best if there are no nearby planets or stars.

    Fun fact, though: make your radiator hotter and you can dump just as much, if not more, energy than you typically would via convective cooling. At 1400 °C (just below the melting point of steel) you can shed about 450 kW of heat per square meter; all you need is a really fancy heat pump!
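
    A quick back-of-envelope check of that figure, using the Stefan-Boltzmann law and assuming an idealized radiator (emissivity 1) facing empty space:

    ```python
    # Radiated flux via the Stefan-Boltzmann law: P/A = emissivity * sigma * T^4
    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

    def radiated_flux(temp_c: float, emissivity: float = 1.0) -> float:
        """Power radiated per square meter by a surface at temp_c degrees Celsius."""
        temp_k = temp_c + 273.15
        return emissivity * SIGMA * temp_k ** 4

    print(f"{radiated_flux(1400) / 1000:.0f} kW/m^2")  # ~444 kW/m^2, in line with the ~450 kW figure
    ```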

Is it an advantage, though? One of the main objections in the article is exactly that.

There's no atmosphere to help with heat loss through convection, and there's nowhere to shed heat through conduction; all you have is radiation. It is a serious engineering challenge for spacecraft to get rid of the little heat they generate and to avoid being overheated by the sun.

  • I think it is an advantage; the question is just how big, assuming we look only at ongoing operating cost.

    - Earth temperatures are variable, and radiative cooling to the sky only works at night

    - The required radiator area is much smaller for the space installation

    - The engineering is simple: CPU -> cooler -> liquid -> pipe -> radiator. We're assuming no constraint on capex, so we can omit heat pumps.

    • A typical CPU heatsink dissipates 10-30% of its heat through radiation and the rest through convection. In space you're in a vacuum, so you can't dissipate heat through convection at all.

      You need to rework your physical equipment quite substantially to make up for the fact that you can't shed 70-90% of the heat the same way you can down here on Earth.
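
      A rough sketch of where a split like that comes from, assuming a hypothetical heatsink surface at 60 °C in 25 °C air, emissivity 0.9, and a convective coefficient of 10-50 W/(m^2 K); all of these numbers are illustrative assumptions, not figures from the thread:

      ```python
      SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

      # Assumed conditions: fins at 60 C, ambient air at 25 C, emissivity 0.9.
      t_surface, t_ambient = 60 + 273.15, 25 + 273.15
      radiative = 0.9 * SIGMA * (t_surface**4 - t_ambient**4)  # ~225 W/m^2

      for h in (10, 50):  # W/(m^2 K): roughly natural vs. forced convection
          convective = h * (t_surface - t_ambient)  # 350 to 1750 W/m^2
          frac = radiative / (radiative + convective)
          print(f"h={h}: radiation carries ~{frac:.0%} of the heat")  # ~39% and ~11%
      ```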

But the cooling cost wouldn’t be smaller. There’s no good way to dump the waste heat into space. It’s actually far, far harder to radiate the waste heat directly into space than it would be to get rid of it on Earth.

  • Which is why vacuum flasks for hot/cold drinks are a thing and work. Empty space is a pretty good insulator, as it turns out.

    It’s a little worrying so many don’t know that.

  • I don't know about that. Look at where the power goes in a typical data center: for a 10 MW DC you might spend 2 MW just to blow air around. A radiating cooler in space would almost eliminate that. The problem is that the initial investment is probably impractical.

    • >99.999% of the power put into compute turns into heat, so you're going to need to reject 8 MW of power into space with pure radiation. The ISS EATCS radiators reject 0.07 MW of power in 85 sq. m, so you're talking about 9700 sq. m of radiators, or bigger than a football field/pitch.

This question is thoroughly covered in the linked article.

  • Pardon, but the question of "could the operational cost be smaller in space" is barely touched in the article. The article mostly argues that designing thermal management systems for space applications is hard and that the required radiators would be big, which speaks to the upfront investment cost, not the ongoing opex.

    • Ok, sure, technically. To be fair, you can't really assess the opex of technology that doesn't exist yet, but I find it hard to believe that operating brand-new, huge machines that have to move fluid around (and not nice fluids, either) will ever cost less than it does on the surface. Better hope you never get a coolant leak. Heck, it might even be that opex=0 still isn't enough to offset the "capex". Space is already hard when you're not trying to launch record-breaking structures.

      Even optimistically, capex goes up by a lot to reduce opex, which means you need a really, really long breakeven time, which means a long time during which nothing breaks. How many months of reduced electricity costs are wiped out if you have to send a tech to orbit?

      Oh, and don't forget the radiation slowly destroying all your transistors. Does that count as opex? Can you break even before your customers start complaining about corruption?

Things on Earth also have access to that coldness for about half of each day. How many data centers use radiative cooling into the night sky to supplement their regular cooling? The fact that the answer is “zero” should tell you all you need to know about how useful this is.

  • The atmosphere is in the way even at night, and it re-radiates the energy back down. The effective background temperature is the temperature of the air, not to mention it would only work at night. I think there would need to be something like 50 acres of radiators for a 50 MW datacenter to cool its loop from 60 °C down to 30 °C. The area would be a lot smaller in space due to the bigger temperature delta. Either way, opex would be much, much less than an average Earth DC (PUE of almost 1 instead of a run-of-the-mill 1.5, or as low as 1.1 for hyperscalers). But yeah, the upfront cost would be immense.
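
    A rough sanity check of that acreage, assuming a radiator panel averaging about 45 °C (midway through a 60-to-30 °C loop), emissivity 0.9, and an effective clear-night-sky temperature of roughly 5 °C; all of these are illustrative assumptions:

    ```python
    SIGMA = 5.670e-8       # Stefan-Boltzmann constant, W / (m^2 K^4)
    M2_PER_ACRE = 4046.86

    t_panel = 45 + 273.15  # assumed average radiator temperature, K
    t_sky = 5 + 273.15     # assumed effective night-sky temperature, K
    net_flux = 0.9 * SIGMA * (t_panel**4 - t_sky**4)  # ~217 W/m^2 net

    area_m2 = 50e6 / net_flux  # area needed to reject 50 MW
    print(f"~{area_m2 / M2_PER_ACRE:.0f} acres")  # ~57 acres, the same order as the 50-acre guess
    ```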

    • I think you’re ignoring a huge factor in how radiative cooling actually works. I thought the initial question was fine if you hadn’t read the article, but I understand the downvotes given the doubling down. Think of it this way: why do thermoses have a vacuum-sealed chamber between two walls in order to insulate the contents of the bottle? Because a vacuum is a fucking terrible conveyor of heat. Putting your data center into space in order to cool it is like putting a computer inside of a thermos to cool it. It makes zero fucking sense. There is nowhere for the heat to actually radiate to, so it stays inside.

  • Look up the Tech Ingredients episode on radiative paint.

    The fact that people aren’t using something isn’t evidence that it’s not possible or even a great idea; it could be that a practical application didn’t exist before, or that someone enterprising enough hasn’t come along yet.

    • When something has been known for millennia and hasn’t been put to a particular use even after decades where it could have been used, that is pretty good evidence that this use isn’t a good idea. Especially when it’s something really simple.

      Radiative cooling is great for achieving temperatures a bit below ambient at night when you don’t have any modern refrigeration equipment. That’s about all. It’s used in space applications because it’s literally the only option.

Breakeven in X years probably makes sense for storage (slow depreciation), not for GPUs (which depreciate in something like 4 years).

  • I think by far the most mass in this kind of setup would go into the heat management, which could probably last a long time and could be amortized separately from the electronics.

    • How would the radiators be useful if the electronics no longer are? Unless you can repurpose the radiators once the electronics are useless, which you can't in space, the radiators' useful lifetime is hard-limited by the electronics' lifetime.