Comment by simianwords
17 hours ago
ok so you are smarter than all of them? and if they had put you in charge instead of the phd's, they might have been better off?
You're using an uninteresting appeal-to-authority argument again.
So let's talk physics. Are you familiar with the radiative heat-balance problem? You can use the Stefan–Boltzmann law to calculate how much radiator area you'd need.
Required area: A = P / (eps * sigma * eta * (Tr^4 - Tsink^4))
Where:
A = radiator area [m^2]
P = waste heat to dump [W]
eps = emissivity (0..1)
sigma = 5.670374419e-8 W/m^2/K^4
eta = non-ideality factor for view factor, blockage, etc. (0..1)
Tr = radiator temperature [K]
Tsink = effective sink temperature [K] (deep space ~3 K, negligible next to Tr^4)
Assuming best-case conditions (deep-space sink, eps ~0.9, eta ~1):
At Tr=300K: ~413 W/m^2
At Tr=350K: ~766 W/m^2
At Tr=400K: ~1307 W/m^2
So for 10 MW at Tr = 350 K (roughly 77 °C): A ≈ 1e7 / 766 ≈ 13,055 m^2 (best case).
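If you want to sanity-check the arithmetic, here's a minimal Python sketch of the same formula; the eps = 0.9, eta = 1, deep-space-sink assumptions are the ones stated above:

```python
# Radiator sizing per A = P / (eps * sigma * eta * (Tr^4 - Tsink^4))
# Assumptions (mine, matching the comment): eps=0.9, eta=1, Tsink ~3 K, 10 MW waste heat.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant [W/m^2/K^4]

def radiator_area(p_watts, t_rad_k, t_sink_k=3.0, eps=0.9, eta=1.0):
    """Radiator area [m^2] needed to reject p_watts of waste heat."""
    net_flux = eps * eta * SIGMA * (t_rad_k**4 - t_sink_k**4)  # net emitted flux [W/m^2]
    return p_watts / net_flux

P_WASTE = 10e6  # 10 MW of waste heat to dump

for t_rad in (300.0, 350.0, 400.0):
    flux = 0.9 * SIGMA * t_rad**4  # emitted flux at eps=0.9, eta=1 [W/m^2]
    area = radiator_area(P_WASTE, t_rad)
    print(f"Tr={t_rad:.0f} K: {flux:.0f} W/m^2 -> {area:,.0f} m^2 for 10 MW")
```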
And even that best case is only 10 MW, and it ignores solar input and IR from the Moon/Earth, etc., so in real life the required area will be even larger.
You can build a 10 MW nuclear power plant (a microreactor), datacenter included, on Earth for the same price.
Show me your numbers or lay out a plan for how to make it economically feasible in space.
you are saying you can stop an entire division in google, nvidia, blue origin with this bit of theory?
like all the employees had to do was read this and be like: wow, i never saw it that way.