Comment by rob74
4 hours ago
It kinda does make sense if you consider that solar panels in space have been used for a very long time (to power satellites). However, getting the electricity they generate down to Earth is very complicated, so you end up having to use it in space, and one of the few things that would make sense for that is indeed data centers, because getting the data to Earth is easier (and Elon already handily has a solution for that).
However, I'm curious how many solar panels you would need to power a typical data center. Are we talking something like a large satellite, or rather a huge satellite with ISS-size solar arrays bolted on? Getting rid of the copious amounts of heat that data centers generate might also be a challenge (https://en.wikipedia.org/wiki/Spacecraft_thermal_control)...
> It kinda does make sense if you consider that solar panels in space have been used for a very long time (to power satellites).
It stops making sense the second you ask how you'd dissipate the heat any GPU would create. Sure, you could use vapour chambers to move the heat, but move it to where? Would this need square kilometers of radiators on top of square kilometers of solar panels? All this just to have Grok in space?
But space is very cold, so no problem there /sarcasm
The plan seems to be for lots and lots of smaller satellites.
For inferencing it can work well. One satellite could contain a handful of CPUs and do batch inferencing of even very large models, perhaps at low speeds in the beginning. Currently most AI workloads are interactive, but I can't see that staying true for long: as things improve and models can be trusted to work independently for longer, it makes more sense to just queue work up and not worry about exactly how high your TTFT (time to first token) is.
For training I don't see it today. In future maybe. But then, most AI workloads in future should be inferencing not training anyway.
A 10 MW data center would need on the order of 0.05-0.1 km² of solar arrays, even in space.
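Back-of-envelope in Python (the solar constant is real; the 30% cell efficiency and 60% sunlit fraction are round-number assumptions, and ISS-era arrays only average ~100 W/m² over an orbit, which would roughly double the area):

    # Rough solar-array sizing for a 10 MW load in low Earth orbit.
    # All inputs are approximate assumptions, not vendor figures.
    SOLAR_CONSTANT = 1361.0  # W/m^2 above the atmosphere
    EFFICIENCY = 0.30        # optimistic multi-junction cells
    SUNLIT_FRACTION = 0.60   # share of a LEO orbit spent in sunlight

    load_w = 10e6  # 10 MW data center
    avg_w_per_m2 = SOLAR_CONSTANT * EFFICIENCY * SUNLIT_FRACTION  # ~245 W/m^2
    area_m2 = load_w / avg_w_per_m2
    print(f"{area_m2:,.0f} m^2 (~{area_m2 / 1e6:.2f} km^2)")  # ~40,800 m^2, ~0.04 km^2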
It’s just as real as the 25k Model 3.
>Getting rid of the copious amounts of heat that data centers generate might also be a challenge
At 70 Celsius - normal for a GPU - 1.5 m^2 radiates something like 1 kW (which takes about 4 m^2 of panels to collect), so it doesn't look to be an issue. (Some point to the ISS, which is a bad example - the ISS has to stay around 20 Celsius, and black-body radiation goes as T^4.)
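A minimal sketch of that arithmetic in Python (assumes an ideal one-sided black-body radiator with emissivity 1 facing deep space and ignores absorbed sunlight, so a real radiator would need somewhat more area):

    # Stefan-Boltzmann flux at a GPU-ish radiator temperature.
    SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

    def blackbody_flux(temp_c):
        """W/m^2 radiated by one side of an ideal black body to deep space."""
        return SIGMA * (temp_c + 273.15) ** 4

    f70 = blackbody_flux(70)                       # ~786 W/m^2
    print(f"1.5 m^2 at 70 C: {1.5 * f70:.0f} W")   # ~1179 W, i.e. 'something like 1 kW'
    print(f"panels per kW: {1000 / 245:.1f} m^2")  # ~4.1 m^2 at ~245 W/m^2 average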
So for the ISS at 20 C you'd get about 419 W/m^2, so you'd need about 2.4 m^2. Comparing the ISS at 20 C to space datacenters at 70 C, you get an improvement of roughly 88%. Nice, but doesn't feel game-changing.
The power radiated goes as T^4, but 70 C is only about 17.1% warmer than 20 C because you need to compare in kelvin (343 K vs 293 K), and 1.171^4 ≈ 1.88, hence the ~88%.
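Plugging both temperatures into the same formula makes it concrete (same ideal black-body assumptions as the sketch above):

    SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)
    f20 = SIGMA * (20 + 273.15) ** 4  # ~419 W/m^2
    f70 = SIGMA * (70 + 273.15) ** 4  # ~786 W/m^2
    # 343 K is only ~17% above 293 K, but 1.17**4 is ~1.88
    print(f"{f70 / f20 - 1:.0%} more flux at 70 C than at 20 C")  # ~88%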