Comment by trhway
8 hours ago
>If your radiator is sized for space, it's overkill in an atmosphere.
no. Again, totally wrong.
The 20-40 °C air surrounding the radiator radiates back at the radiator too. This is why a human immediately gets stone cold in space but not in the atmosphere: our body radiates away about 900 W and receives 800+ W back from the atmosphere, so our internal heat generation has to cover only the difference, usually less than 100 W.
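(A quick Stefan-Boltzmann sanity check of those numbers - a sketch where the skin area, emissivity, and temperatures are all illustrative assumptions, not measurements:)

    # Sanity check: sketch of the ~900 W out / ~800 W back figures.
    # All inputs are illustrative assumptions.
    SIGMA = 5.67e-8       # Stefan-Boltzmann constant, W/(m^2 K^4)
    AREA = 1.9            # assumed human skin area, m^2
    EPS = 0.95            # skin is nearly a blackbody in the infrared

    T_SKIN = 307.0        # ~34 C skin temperature, K
    T_ROOM = 298.0        # ~25 C surroundings, K

    emitted = SIGMA * EPS * AREA * T_SKIN**4      # ~909 W radiated away
    received = SIGMA * EPS * AREA * T_ROOM**4     # ~807 W absorbed back
    print(emitted, received, emitted - received)  # net ~100 W to cover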
You probably meant forced convection cooling. That requires additional machinery, and that additional machinery is a significant part of why ground-based datacenters are so expensive to build and operate.
To the comment below:
>The planet underneath anything in low orbit also does this, making this argument irrelevant.
no. Again, totally wrong. You've just stated that a human in LEO wouldn't get immediately cold when exposed to space. Just think about it for a second. And after that, plug the numbers into a thermodynamic calculator. You'll see your error.
>Likewise, the fact that convection exists even without the adjective "forced".
no. Again, wrong. Non-forced convection is pretty small. Use the calculator. And you'll understand why datacenters use forced convection.
The planet underneath anything in low orbit also does this, making this argument irrelevant. There are even cheap paints specifically made to be most emissive in the wavelength window where the atmosphere is mostly transparent, rather than in the band where the atmosphere itself emits.
As does the fact that humans are only slightly warmer than their surroundings. A human-sized object at the operating temperature of a GPU would have a net radiative loss in Earth's atmosphere of around 0.9-1.3 kW.
Likewise, the fact that convection exists even without the adjective "forced". Again, if you replace a human with an identically shaped android at maximum GPU operating temperatures of 80-100 °C, normal (non-forced) convection goes from ~117 W (human) to 0.9-1.3 kW (80 °C) to 1.2-2 kW (100 °C).
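(To reproduce those ball-park figures, here's a minimal sketch; the surface area, emissivity, and free-convection coefficient h are assumptions, so the exact numbers shift with whatever you pick:)

    # Ball-park radiation and free convection for a human-sized surface.
    # Radiation: Stefan-Boltzmann. Convection: Q = h * A * dT, with
    # h ~ 5-15 W/(m^2 K) typical for still air (it grows slowly with dT).
    SIGMA, EPS, AREA = 5.67e-8, 0.95, 1.8
    T_AIR = 295.0   # ~22 C air, K

    def net_radiation(t_surface):
        return SIGMA * EPS * AREA * (t_surface**4 - T_AIR**4)

    def convection(dT, h):
        return h * AREA * dT

    print(net_radiation(306), convection(11, 6))   # human: ~116 W and ~119 W
    print(net_radiation(353), convection(58, 9))   # 80 C: ~0.77 kW and ~0.94 kW
    print(net_radiation(373), convection(78, 11))  # 100 C: ~1.14 kW and ~1.54 kW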
> > The planet underneath anything in low orbit also does this, making this argument irrelevant.
> no. Again, totally wrong. You've just stated that a human in LEO wouldn't get immediately cold when exposed to space. Just think about it for a second. And after that, plug the numbers into a thermodynamic calculator. You'll see your error.
I already did, before my previous comment. I was also considering adding "don't forget evaporative cooling from human bodily fluids" to that comment, but it seemed an irrelevant tangent when discussing data centres.
Now, if you plug the mass of a human and the specific heat capacity of water into a thermodynamic calculator, tell me: how long would it take for a human to cool by one degree?
https://www.wolframalpha.com/input?i=%2870+Kg+*+%28specific+...
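(The query is truncated, but spelled out it's just mass x specific heat / net loss - a sketch using water's heat capacity as a stand-in for tissue:)

    # Time to cool a 70 kg human by 1 K at ~1 kW net radiative loss,
    # using water's specific heat as in the (truncated) query above.
    MASS = 70.0         # kg
    C_WATER = 4184.0    # J/(kg K)
    NET_LOSS = 1000.0   # W

    seconds_per_kelvin = MASS * C_WATER / NET_LOSS
    print(seconds_per_kelvin / 60)   # ~4.9 minutes per degree - hardly "immediately"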
And that's with the ~1 kW radiative losses from being in shadow, far enough from Earth to not get meaningful thermal radiation from the planet itself. Even at 500 km, thermal radiation from Earth will still add about 200 W/m^2. This is comparable to the thermal paint previously mentioned, whose peak emissivity (and by extension absorption) is chosen to lie at a different wavelength than the thermal emission of air-temperature objects.
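(That figure is easy to check from the view factor of the Earth disc - a sketch, taking 240 W/m^2 as the usual mean outgoing-longwave value:)

    # Earth's thermal IR reaching a plate at altitude h, facing the planet:
    # flux ~ OLR * (R / (R + h))^2, the view factor of the Earth disc.
    OLR = 240.0        # mean outgoing longwave radiation, W/m^2
    R_EARTH = 6371e3   # m
    ALT = 500e3        # m

    flux = OLR * (R_EARTH / (R_EARTH + ALT)) ** 2
    print(flux)        # ~206 W/m^2, matching the ~200 W/m^2 figure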
> >Likewise, the fact that convection exists even without the adjective "forced".
> no. Again, wrong. Non-forced convection is pretty small. Use the calculator.
I did, for both humans and GPUs; you saw the results. Humans are the wrong reference class.
In your own words, "Just think about it for a second": a human in humid 40 °C air is in immediate danger because all the sources of cooling have been blocked off. Radiation becomes balanced; I said humid to block off evaporation; conduction and convection have the same problem there as radiation. A GPU wouldn't have a problem with 40 °C ambient, because it will still be radiating heat, conducting heat, and, by conducting heat to the air specifically, also convecting it away.
Many, many words, going sideways and around because you can't go against the basic thermodynamic facts directly. What is your point?
My point, I'll repeat, is that while an 80 °C GPU will still radiate when surrounded by 40 °C air, it will be receiving back the radiation from the 40 °C air, whereas in space it will radiate the same while receiving practically nothing back from the environment. Both cases, obviously, are considered in shadow.
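(For concreteness, a sketch of that comparison over an assumed 1 m^2 with emissivity 0.9, treating the 40 °C air as a blackbody - which is exactly the assumption the replies dispute:)

    # Net radiative loss of 1 m^2 at 80 C: 40 C air vs deep-space shadow.
    SIGMA, EPS, AREA = 5.67e-8, 0.9, 1.0
    T_HOT, T_AIR, T_SPACE = 353.0, 313.0, 3.0   # K

    def net(t_env):
        return SIGMA * EPS * AREA * (T_HOT**4 - t_env**4)

    print(net(T_AIR))    # ~300 W against 40 C surroundings radiating back
    print(net(T_SPACE))  # ~790 W against the ~3 K sky, nothing coming back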
To the comment below:
>False
you wasted my time as you don't seem to understand the basics of thermodynamics.
>and also irrelevant as if you let the space based ones go into shadow you wasted most of the point of going to space.
again, you wasted my time as you don't understand the datacenter construction discussed in the sibling comments.
From my point of view, ben_w definitely understands thermodynamics better than you. I'll point out that, generally speaking, radiative heat transfer from air is not particularly significant locally: it only tends to matter in the details when you're dealing with the whole atmosphere, which on average is a lot cooler. The transfer is also not blackbody radiation, so even then you can't really plug the air temperature into a radiative heat transfer calculation and expect a sensible result.
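To put one number on that: Swinbank's common clear-sky correlation gives an effective sky temperature noticeably below the air temperature. A sketch (the correlation itself is an empirical approximation):

    # Swinbank (1963) clear-sky correlation: T_sky = 0.0552 * T_air**1.5.
    # The sky radiates like a blackbody cooler than the surface air.
    def t_sky(t_air_k):
        return 0.0552 * t_air_k ** 1.5

    for t_air in (293.0, 313.0):     # 20 C and 40 C air
        print(t_air, t_sky(t_air))   # ~277 K and ~306 K effective sky temp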
> What is your point?
I do not waste words, perhaps read them and you will find out.
> My point, I'll repeat, is that while an 80 °C GPU will still radiate when surrounded by 40 °C air, it will be receiving back the radiation from the 40 °C air, whereas in space it will radiate the same while receiving practically nothing back from the environment. Both cases, obviously, are considered in shadow.
False, as demonstrated in the words you didn't see the point of; and also irrelevant, since if you let the space-based ones go into shadow, you've wasted most of the point of going to space.