Comment by jcattle
2 days ago
Taking a system that was conceptualized about a quarter of a century ago and that serves very different needs than a datacenter in space would have (e.g. a very strict thermal band, versus an acceptable temperature range of roughly 20 to 80 °C) isn't ideal.
The physics is quite simple and you can definitely make it work out. The Stefan-Boltzmann law works in your favor the higher you can push your temperatures.
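A minimal sketch of that T^4 scaling, assuming an ideal black-body surface (emissivity 1) radiating from one face:

    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

    def radiated_w_per_m2(temp_k, emissivity=1.0):
        """Power radiated per unit area by one face of a surface at temp_k."""
        return emissivity * SIGMA * temp_k ** 4

    for t in (300, 350, 400, 500):
        print(f"{t} K: {radiated_w_per_m2(t):.0f} W/m^2")
    # prints roughly 459, 851, 1452, 3544 W/m^2 -- doubling the absolute
    # temperature multiplies the radiated power by 16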
If anything, an orbital datacenter could be a slightly easier case. Ideally it will be in an orbit that always sees the sun. Most other satellites pass through Earth's shadow from time to time, making heaters as well as radiators necessary.
These data centers are solar powered, right? So if they absorb 100% of the energy on their sun side, by default they'll only heat up as much as an object left in the sun, which I assume isn't very hot compared to the amount of energy they're taking in. How do they crank their temperature up so as to get the Stefan-Boltzmann law working in their favor?
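(For a rough sense of scale, here is a sketch of the equilibrium temperature of an idealized flat black plate at Earth's distance from the Sun, absorbing everything that lands on the lit face:)

    # Equilibrium of a flat plate in full sun at 1 AU:
    # absorbed = S * A on the lit face, radiated = sigma * A * T^4 per radiating face
    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)
    SOLAR = 1361.0    # solar constant at 1 AU, W/m^2

    t_both_faces = (SOLAR / (2 * SIGMA)) ** 0.25  # radiates from front and back
    t_one_face = (SOLAR / SIGMA) ** 0.25          # insulated back, hotter case

    print(t_both_faces - 273.15, t_one_face - 273.15)  # roughly 58 C and 120 C

So an idealized passive plate already sits somewhere between roughly 60 and 120 °C, depending on how well its back side radiates.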
I suppose one could get some part of the satellite to a higher temperature so that it radiates heat efficiently, but that would itself take power: the power required to concentrate heat, which naturally/thermodynamically prefers to stay spread out. How much power does that take? I have no idea.
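At minimum, an ideal (Carnot) heat pump sets a lower bound: lifting heat Q from the chips at T_cold to a hotter radiator at T_hot costs at least W = Q * (T_hot - T_cold) / T_cold. A rough sketch with made-up temperatures:

    # Carnot lower bound on the work needed to pump heat Q from the chips
    # (T_cold) up to a hotter radiator (T_hot): W = Q * (T_hot - T_cold) / T_cold
    def carnot_pump_power_w(heat_w, t_cold_k, t_hot_k):
        return heat_w * (t_hot_k - t_cold_k) / t_cold_k

    # e.g. 1 kW of chip heat at ~60 C (333 K) lifted to a ~150 C (423 K) radiator
    print(carnot_pump_power_w(1000, 333, 423))  # ~270 W extra, before real-world losses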
σ in the Stefan-Boltzmann law is such a small number that pushing the temperature doesn't buy you much until your radiators get hot enough to start melting.
You not only need absolutely huge radiators for a space data centre, you also need an active cooling/pumping system to make sure the heat is evenly distributed across them.
I'm fairly sure no one has built a kilometer-sized fridge radiator before, especially not in space.
You can't just stick some big metal fins on a box and call it a day.
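To put rough numbers on "huge", a sketch assuming ideal black-body radiators held at a uniform 350 K, radiating from both faces and ignoring any sunlight falling on them:

    # Radiator area needed to reject P watts at temperature T,
    # radiating from both faces: A = P / (2 * sigma * T^4)
    SIGMA = 5.670e-8  # W / (m^2 K^4)

    def radiator_area_m2(power_w, temp_k):
        return power_w / (2 * SIGMA * temp_k ** 4)

    for megawatts in (1, 100, 1000):
        print(f"{megawatts} MW at 350 K: {radiator_area_m2(megawatts * 1e6, 350):,.0f} m^2")
    # ~590 m^2 per MW, so a gigawatt-class facility needs on the order of
    # a square kilometre once real emissivity, sun load and plumbing losses
    # are accounted for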
Out of curiosity, I plugged in the numbers. I have solar at home, and a 2 m² panel makes about 500 W; I assume the one in orbit will be a bit more efficient without the atmosphere and a bit fancier, so say it generates 750 W.
If we run the radiators at 80 °C (a reasonable temperature for silicon), that's about 350 K. Assuming the surroundings are at 0 K, a radiator of the same area can radiate away about 1500 W, so roughly double the panel's output.
Depending on what percentage of the time we spend in sunlight (it depends on the orbit, but the number is between 50% and 100%, with 66% a good estimate for LEO), we can scale the required radiator area by that fraction, since the heat only comes in while we're in sunlight but we radiate all the time.
So a LEO satellite in a decaying orbit (designed to crash back to Earth after 3 years, or one GPU generation) could technically work with about 33% of the solar panel area dedicated to cooling.
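Roughly that arithmetic in code, using the assumed numbers above (2 m² panel, 750 W in orbit, 80 °C radiators, deep space treated as 0 K, 66% of the orbit in sunlight):

    SIGMA = 5.670e-8             # W / (m^2 K^4)
    panel_area_m2 = 2.0
    panel_power_w = 750.0        # assumed orbital output of the 2 m^2 panel
    radiator_t_k = 80 + 273.15   # ~353 K radiator, sink treated as 0 K

    # radiator with the same area as the panel, radiating from one face
    radiator_capacity_w = SIGMA * radiator_t_k ** 4 * panel_area_m2

    sunlit_fraction = 0.66       # rough LEO duty cycle
    avg_heat_w = panel_power_w * sunlit_fraction

    print(radiator_capacity_w)               # ~1760 W for a panel-sized radiator
    print(avg_heat_w / radiator_capacity_w)  # ~0.28 -- roughly a third of the
                                             # panel area in radiator keeps up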
Realistically, I'd say solar panels are so cheap that it'd make more sense to build a huge solar park in Africa and accept the much lower duty cycle (roughly 8 hours of sunlight a day, i.e. about 33%, versus 66% in LEO), as the rest of the infrastructure is so much simpler.
But it's fun to think about these things.