Comment by MadnessASAP
19 hours ago
I ran the math the last time this topic came up.
The short answer is that ~100m2 of steel plate at 1400C (just below its melting point) will shed 50MW of power in black body radiation.
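For anyone who wants to sanity-check that figure, here is a quick Stefan–Boltzmann sketch (assuming emissivity 1 and radiation from one face only, both simplifications):

```python
# Black-body power from a 100 m^2 plate at 1400 C, emissivity 1, one face.
SIGMA = 5.670e-8           # Stefan-Boltzmann constant, W / (m^2 K^4)

area_m2 = 100.0
temp_k = 1400.0 + 273.15   # ~1673 K

power_w = SIGMA * area_m2 * temp_k ** 4
print(f"~{power_w / 1e6:.0f} MW")   # ~44 MW, i.e. on the order of the 50 MW quoted
```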
The temperature of space datacenters will be limited to 100 degrees Celsius, because otherwise the electronic equipment will be destroyed.
So your huge metal plate would radiate (1673/373)^4 ≈ 400 times less heat, i.e. only about 125 kW.
In reality it would radiate much less than that, even if made of copper or silver covered with Vantablack, because limited thermal conductivity will reduce the temperature of the parts of the plate distant from the heat source.
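Rough arithmetic behind that scaling, under the same simplifying assumptions as above (emissivity 1, one radiating face):

```python
# Same 100 m^2 plate, but capped at 100 C so the electronics survive.
SIGMA = 5.670e-8            # Stefan-Boltzmann constant, W / (m^2 K^4)
area_m2 = 100.0

t_hot = 1400.0 + 273.15     # ~1673 K, the steel-plate case above
t_cold = 100.0 + 273.15     # ~373 K, electronics-friendly limit

ratio = (t_hot / t_cold) ** 4              # ~400x less power per unit area
p_cold_w = SIGMA * area_m2 * t_cold ** 4   # ~110 kW for the whole plate
print(f"ratio ~{ratio:.0f}, power ~{p_cold_w / 1e3:.0f} kW")
```

(The ~110 kW here versus the ~125 kW above is just the difference between using the exact Stefan–Boltzmann numbers and the rounded 50 MW figure.)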
Which GPU runs at 1400C?
One made of steel presumably.
I would assume such a setup involves multiple stages of heat pumps to get from the GPU to the 1400C radiator. Obviously that's going to impact efficiency.
Also, I'm not seriously suggesting that 1400C radiators are a reasonable approach to cooling a space data centre. It's just intended to demonstrate how infeasible the idea is.
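For scale, here is what that efficiency impact looks like even in the best case, assuming a single ideal (Carnot-limited) pump stage and the hypothetical 50 MW load from above:

```python
# Carnot limit for pumping heat from ~100 C electronics up to a ~1400 C radiator.
t_cold = 100.0 + 273.15      # ~373 K, electronics side
t_hot = 1400.0 + 273.15      # ~1673 K, radiator side
q_chip_w = 50e6              # hypothetical 50 MW of electronics heat to move

cop_cooling = t_cold / (t_hot - t_cold)   # ~0.29 W of heat moved per W of work
pump_work_w = q_chip_w / cop_cooling      # ~174 MW of pump power, ideal case
print(f"cooling COP ~{cop_cooling:.2f}, pump work ~{pump_work_w / 1e6:.0f} MW")
```

In other words, even an ideal pump would draw several times the IT load in extra power; real multi-stage hardware would be worse.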
Using heat pumps to raise the radiator temperature is unlikely to increase the fraction of the original heat that gets radiated per unit of heatsink surface, i.e. the heat added by the pump may exceed the additional heat radiated, though I am too lazy to compute now whether that is the case.
Moreover, a heat pump would add equipment with moving parts that can fail, requiring maintenance.
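For anyone who does want to run those numbers, a sketch of the trade-off under an ideal Carnot assumption (purely illustrative, reusing the hypothetical 50 MW load from above; a real pump over this lift would be far less favourable):

```python
# Compare radiator area and total heat to reject, with and without an ideal pump.
SIGMA = 5.670e-8             # Stefan-Boltzmann constant, W / (m^2 K^4)
q_chip_w = 50e6              # hypothetical 50 MW of electronics heat
t_cold = 100.0 + 273.15      # ~373 K, electronics side
t_hot = 1400.0 + 273.15      # ~1673 K, pumped radiator side

# Without a pump: radiate everything at ~373 K.
area_no_pump_m2 = q_chip_w / (SIGMA * t_cold ** 4)       # ~45,000 m^2

# With an ideal pump: the radiator must shed the chip heat plus the pump work.
q_radiated_w = q_chip_w * (t_hot / t_cold)               # ~224 MW total
added_heat_w = q_radiated_w - q_chip_w                   # ~174 MW of pump work
area_with_pump_m2 = q_radiated_w / (SIGMA * t_hot ** 4)  # ~500 m^2

print(f"no pump: ~{area_no_pump_m2:.0f} m^2 of radiator")
print(f"ideal pump: ~{area_with_pump_m2:.0f} m^2, plus ~{added_heat_w / 1e6:.0f} MW extra heat")
```

Even in the ideal case, the heat added by the pump (~174 MW) is several times the original 50 MW load, so any area saving comes at a very steep power cost, and a real pump over such a lift would be worse still.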