Comment by OneDeuxTriSeiGo
8 hours ago
It's worth noting that the EATCS (the ISS's external active thermal control system) can dissipate at most 70 kW of waste heat, and the EETCS (the original external cooling system) can only dissipate another 14 kW.
Together that is less than a single AI inference rack.
And to achieve that, the EATCS needs six radiator ORUs, each spanning 23 meters by 11 meters with a mass of 1100 kg. That's roughly 1500 square meters and six and a half metric tons before you factor in any of the actual refrigerant, pumps, support beams, valve assemblies, rotary joints, or cold-side heat exchangers, all of which together will probably double the mass you need to put in orbit.
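The area and mass figures above multiply out as follows (a quick check using the numbers as stated):

```python
# Quick check of the EATCS radiator figures quoted above.
n_orus = 6
length_m, width_m = 23, 11       # per-ORU span, as stated
mass_per_oru_kg = 1100

total_area_m2 = n_orus * length_m * width_m
total_mass_t = n_orus * mass_per_oru_kg / 1000

print(total_area_m2, "m^2")   # 1518 m^2, i.e. ~1500 square meters
print(total_mass_t, "t")      # 6.6 t, i.e. ~6.5 metric tons
```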
There is no situation where that makes sense.
-----------
Manufacturing in space makes sense (all kinds of techniques are theoretically easier in zero G and hard vacuum).
Mining asteroids, etc makes sense.
Datacenters in space for people on earth? That's just stupid.
Your calculations are based on cooling to 20 °C, which is far harder than cooling to 70 °C, where GPUs are happy: radiated power scales with the fourth power of temperature. Radiators would be roughly 1/3 the size of the panels at 70 °C.
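The T⁴ scaling follows from the Stefan-Boltzmann law. The emissivity and effective sink temperature below are assumptions on my part, but with a sink around 250 K the area ratio does land near 1/3:

```python
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
EPS = 0.9         # emissivity (assumed, typical radiator coating)
T_SINK = 250.0    # effective sink temperature in K (assumed)

def net_flux(t_surface_k):
    """Net radiated power per unit area, W/m^2."""
    return EPS * SIGMA * (t_surface_k**4 - T_SINK**4)

# Required area scales inversely with flux, so compare 70 C vs 20 C:
area_ratio = net_flux(293.0) / net_flux(343.0)
print(f"a 70 C radiator needs {area_ratio:.2f}x the area of a 20 C one")
```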
> Datacenters in space for people on earth? That's just stupid.
But it completes the vision of our ancestors, who thought God lived in the sky.
So "Lord, give me a sign from the heavens" may take on a whole new meaning.
I'm a total noob on this.
I get that vacuum is a really good insulator, which is why we use it to insulate our drinks bottles. So disposing of the heat is a problem.
Can't we use it, though? Like, I dunno, to take a really stupid example: boil water and run a turbine with the waste heat? Convert some of it back to electricity?
What do you do with the steam afterwards? If you eject it, you have to bring a lot of water with your spacecraft, and that costs serious money. If you let it condense back into water, all you've done is move some heat around inside the spacecraft, almost certainly creating even more heat in the process.
It's a good question, but in a closed system (like you have in space) the heat from the turbine loop has to go somewhere for it to be useful. Say you have a coolant loop for the GPUs (maybe glycol). You take the hot glycol, run it through a heat exchanger, and heat up your cool, pressurized ammonia; the ammonia gets hot (and the glycol, now cool, goes back). You then send the ammonia through the turbine, where it evaporates as it expands and loses pressure, spinning the turbine. But now what? You have warm, vaporized, low-pressure ammonia, and you need to cool it down to start over. Once it's cool you can re-pressurize it so you can heat it up and use it again, but you have to cool it first, and that's the crux of the issue.
The problem is essentially that everything you do releases waste heat, so you either reject it, or everything keeps heating up until something breaks. Extracting useful work from that heat only helps if it helps reject it, and it's more efficient to reject it directly.
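One way to see why extracting work doesn't get you out of rejecting heat: even an ideal Carnot engine between the hot loop and the radiator converts only a small fraction into work, and everything else must still be radiated. The temperatures below are illustrative assumptions, not figures from the thread:

```python
T_HOT = 343.0    # K, ~70 C coolant loop (assumed)
T_COLD = 293.0   # K, ~20 C radiator (assumed)
Q_IN = 70e3      # W, an EATCS-scale heat load

eta = 1.0 - T_COLD / T_HOT        # Carnot limit on conversion to work
rejected_kw = Q_IN * (1.0 - eta) / 1e3
print(f"Carnot efficiency: {eta:.1%}")                 # ~14.6%
print(f"heat still to radiate: {rejected_kw:.0f} kW")  # ~60 kW
```

And a real turbine would fall well short of the Carnot limit, so you still need nearly the full radiator either way.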
A more direct way to think about this might be to look at the Seebeck effect. If you have a giant radiator, you could put a Peltier module between it and your GPU cooling loop and generate a little electricity, but that would necessarily create some waste heat of its own, so you're better off cooling the GPU directly.
You can't easily use low grade heat.
However, there are workarounds. People talk as if the only radiator design is the one on the ISS, but there are other ways to build radiators, and it's all about surface area. One approach is to heat a liquid and then spray it openly into space on a level trajectory toward a collecting dish. Because the liquid is now lots of tiny droplets, the surface area is huge, so it can radiate a lot of heat. You don't need a large amount of material as long as you can scoop up the droplets at the other end of the "pipe" and avoid wasting too much. Small losses might even be OK if you have an automated space robot that goes around docking with these systems and topping them up.
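The surface-area win from droplets is easy to quantify: splitting a mass m of coolant into spheres of radius r gives total area 3m/(ρr), so the area grows as the droplets shrink. A sketch with assumed numbers (a low-vapor-pressure oil, 100 µm droplets):

```python
import math

def droplet_area(mass_kg, density_kg_m3, radius_m):
    """Total surface area of mass split into equal spherical droplets."""
    n = mass_kg / (density_kg_m3 * (4.0 / 3.0) * math.pi * radius_m**3)
    return n * 4.0 * math.pi * radius_m**2   # simplifies to 3*m/(rho*r)

# 100 kg of oil (density ~900 kg/m^3, assumed) as 100-micron droplets:
area = droplet_area(100.0, 900.0, 100e-6)
print(f"{area:.0f} m^2")  # ~3300 m^2, vs ~1500 m^2 for the ISS ORUs
```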
It's also harder to move waste heat around in space, since without gravity there's no natural convection.