> In the right orbit, a solar panel can be up to 8 times more productive than on earth, and produce power nearly continuously, reducing the need for batteries.
Sure. Now do cooling. That this isn't in the "key challenges" section makes this pretty non-serious.
A surprising amount of the ISS is dedicated to this, and they aren't running a GPU farm. https://en.wikipedia.org/wiki/External_Active_Thermal_Contro...
Barely mentioning thermal management seems at odds with the X principle of "Don’t use up all your resources on the easy stuff": https://blog.x.company/tackle-the-monkey-first-90fd6223e04d
This was absolutely the first thing I looked for too. They barely mentioned thermal management at all. Maybe they know something I don't, but I know from past posts here that many people share this concern. Very strange that they didn't go there. Or maybe they didn't because they have no solution, and this is just greenwashing for the costs of AI.
No, they just assumed their design fits within the operational envelope of a conventional satellite - the paper (which no one read, apparently) literally says their system design "assumes a relatively conventional, discrete compute payload, satellite bus, thermal radiator, and solar panel designs".
This is not the 1960s. Today, if you have an idea for doing something in space, you can start by scoping out the details of your mission plan and payload requirements, and then see if you can solve it with parts off a catalogue.
(Of course there are a million issues that will crop up when actually designing and building the spacecraft, but that's too low-level for this kind of paper, which just notes that (the authors believe) the platform requirements fall close enough to existing systems to not be worth belaboring.)
How much are you ready to bet against Elon's plans to scale up Starlink v3 for GPUs? Starlink v3 already has a 60 m long solar array, so they're already solving dissipation for that size. Assume linear scaling to many thousands of modules.
From https://x.com/elonmusk/status/1984249048107508061:
"Simply scaling up Starlink V3 satellites, which have high speed laser links would work. SpaceX will be doing this."
From https://x.com/elonmusk/status/1984868748378157312:
"Starship could deliver 100GW/year to high Earth orbit within 4 to 5 years if we can solve the other parts of the equation. 100TW/year is possible from a lunar base producing solar-powered AI satellites locally and accelerating them to escape velocity with a mass driver."
> How much are you ready to bet against Elon's plans to scale up Starlink v3 for GPUs?
I'm sure they'll be ready right after the androids and the robotaxi and the autonomous LA-NYC summoning.
> Starlink v3 already has a 60M length solar array, so they're already solving dissipation for that size.
Starlink v3 doesn't exist yet. They're renders at this point. Full-sized v2s haven't even flown yet, just mass simulators.
https://en.wikipedia.org/wiki/Starlink#Satellite_revisions
Cooling area seems similar to generation area, so maybe it's less of a key challenge?
GPT says 1000 W at 50 C takes about 3 m^2 to radiate (edge-on to Earth and Sun), and generating that 1000 W takes about... 3 m^2 of solar panel. The panel needs its backside radiator clear to keep itself coolish (~100 C), so the compute radiator does need to be a separate surface. Spreading a 1000 W point source across a 3 m^2 tile (or half that if two-sided?) is perhaps not scary, even with weight constraints? (Rough numbers sketched below.)
Hmm, from an order-of-magnitude perspective, it looks like an (L-shaped) Starlink v2 sat has ~100 m^2 of panel, a draw in the low 10s of kW, and a body area in the low 100s of m^2. And there are ~10k of them. So you'd want something bigger. A 100 x 100 m sheet might get you down to ~10 sats per 100,000-GPU data center.
Regarding the ISS: the ISS has its whole big self basking in sunlight and needing to be cooled, versus a design where the only sun-lit thing is the panel.
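For anyone who wants to sanity-check those figures, here's a minimal Stefan-Boltzmann sketch in Python. The emissivity, sink temperature, and cell efficiency are assumptions of mine, not values from the paper; with a lower emissivity or a warmer effective sink (Earth IR, the hot panel nearby) you drift toward the ~3 m^2 quoted above.

    # Back-of-envelope check of the 1 kW radiator/panel areas discussed above.
    # Emissivity, temperatures, and cell efficiency are assumed values.
    SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

    def radiator_area_m2(power_w, t_rad_k=323.0, t_sink_k=4.0, emissivity=0.9, sides=1):
        """Radiator area needed to reject power_w to a cold sink by thermal radiation."""
        flux = emissivity * SIGMA * (t_rad_k**4 - t_sink_k**4)  # W/m^2 per radiating side
        return power_w / (flux * sides)

    def panel_area_m2(power_w, solar_constant=1361.0, cell_efficiency=0.30):
        """Solar panel area needed to generate power_w in full, unfiltered sunlight."""
        return power_w / (solar_constant * cell_efficiency)

    print(f"radiator, one-sided at 50 C: {radiator_area_m2(1000):.1f} m^2")
    print(f"radiator, two-sided at 50 C: {radiator_area_m2(1000, sides=2):.1f} m^2")
    print(f"panel at 30% cells:          {panel_area_m2(1000):.1f} m^2")

With those assumptions you get roughly 1.8 m^2 (one-sided) or 0.9 m^2 (two-sided) of radiator and about 2.4 m^2 of panel per kW, i.e. the same order as the generation area.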
Just run your AI calculations on your favorite Cryoarithmetic Engine, no problem.
Point solar panels away from the Sun and they work as rudimentary radiators :).
More seriously though, the paper itself touches on cooling and radiators. Not much, but that's reasonable - cooling isn't rocket science :), it's a solved problem. Talking about it here makes as much sense as talking about basic attitude control. Cooling the satellite and pointing it in the right direction are solved problems. They're important to detail in full system design, but not interesting enough for a paper that's about "data centers, but in space!".
Cooling at this scale in space is very much not a solved problem. Some individual datacenter racks use more power than the entire ISS cooling system can handle.
It's solved on Earth because we have relatively easy (and relatively scalable) ways of getting rid of it - ventilation and water.
The article doesn’t even have the word “heat” in it.
The linked paper does.
that's easy - just put everything right behind the solar panels /s
I'm completely puzzled on why space-based compute is so exciting to everyone all of a sudden. I have worked on spacecraft and the constant power benefit seems comically far from outweighing the many, many negatives, even if launch cost is zero, which we are still very far from.
Am I missing something? Feels like an extremely strong indicator that we're in some level of AI bubble because it just doesn't make any sense at all.
I think it's enthusiasm from SpaceX delivering Starlink at a sensible price, and Starship looking like it's probably going to be fully reusable and bring prices down further.
Given Musk's behaviour on the world stage… I wouldn't bet on SpaceX being allowed to allow him on-premises after 2028, let alone direct the company and get it to deliver the price goals he's suggested in various places.
Since LLM results aren't trustworthy anyways, what's a few bit flips amongst friends?
Cooling is conspicuously absent other than a brief mention in the conclusion. As if it has been redacted, because it’s such an obvious and hard problem in space. Which leads me to believe they’ve made progress and aren’t sharing it for competitive reasons. There’s an extremely strong incentive for SpaceX to put GPUs on board their birds for local SDR processing power, for applications like SIGINT, high channel counts, etc., and the cooling is literally the only impediment.
In fact everything in this paper is already solved by SpaceX except GPU cooling.
> Cooling is conspicuously absent other than a brief mention in the conclusion.
It's not absent - it's covered in the paper, which this blog release summarizes. There's a link to the paper itself in the side bar.
> In fact everything in this paper is already solved by SpaceX except GPU cooling.
Cooling is already solved by SpaceX too, since this paper basically starts with the idea of swapping out whatever payload is on Starlink for a power-equivalent payload of TPUs, and then goes from there.
It will require a number of innovations just to solve the formation flying aspect of the system, not to mention the other challenges (listed and not)... good luck with that.
Right, but they're flying them close on purpose - the point is, at first glance it looks feasible, and the close-formation aspect has enough benefits that it's worth exploring further. For me, it's the first time I've seen the idea of exploiting a constellation for a benefit within the system (here, communication between satellites) rather than externally (synthetic-aperture telescopes/beaming, or just more = lower orbit = cheaper).
What sort of formation are you thinking of? They’re all going to be hugging the terminator, like a big merry go round.
The ultimate "out of sight out of mind" solution to a problem?
I'm surprised that Google has drunk the "Datacenters IN SPACE!!!1!!" Kool-Aid. Honestly I expected more.
It's so easy to poke a hole in these systems that it's comical. Answer just one question: How/why is this better than an enormous solar-powered datacenter in someplace like the middle of the Mojave Desert?
From the post, they claim 8 times more solar energy and no need for batteries because the satellites are continuously in the sun. Presumably at some scale and some cost/kg to orbit this starts to pencil out?
You're trading an 8x smaller low-maintenance solid-state solar field for a massive probably high-maintenance liquid-based radiator field.
No infrastructure, no need for security, no premises, no water.
I think it's a good idea, actually.
Think of any near-future spacecraft, or ideas for spaceships cruising between Earth and the Moon or Mars, that aren't single use. What are (or will be) such spacecraft? Basically data centers with some rockets glued to the floor.
It's probably not why they're interested in it, but I'd like to imagine someone with a vision for the next couple decades realized that their company already has data centers and powering them as their core competency, and all they're missing is some space experience...
Sure, if you don't mind boiling the passengers.
I think the atmosphere absorbs something like 25% of the incoming solar energy. If that's correct, the same panel in LEO delivers about 33% more power (1/0.75), so you get a free ~33% increase in the compute you can run behind it.
And you can pretty much choose how long you want your day to be (within limits). The ISS has a sunrise every 90 minutes. A ~45-minute night is obviously much easier to bridge with batteries than the ~12 hours of night on the surface. And if you spend a bunch more fuel on getting into a better orbit, you even get perpetual sunlight, again more than doubling your energy output (and your thermal challenges). (Quick numbers below.)
I have my doubts that it's worth it at current or near-future launch costs. But at least it's more realistic than putting solar arrays in orbit and beaming the power down.
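A quick back-of-envelope on those two points; the absorption fraction and the eclipse/night lengths are the rough figures from the comment above, not measured values.

    # Rough check of the "free ~33%" claim, and of why a LEO eclipse is easier
    # to bridge with batteries than a night on the ground. Inputs are the rough
    # figures from the comment above.
    ATMOSPHERIC_ABSORPTION = 0.25  # assumed fraction of sunlight lost to the atmosphere

    leo_gain = 1.0 / (1.0 - ATMOSPHERIC_ABSORPTION)
    print(f"per-panel gain from skipping the atmosphere: {leo_gain:.2f}x")

    def storage_needed_kwh(load_kw, dark_hours):
        """Battery energy needed to carry a constant load through one dark period."""
        return load_kw * dark_hours

    print(f"LEO eclipse (~0.75 h), 10 kW load: {storage_needed_kwh(10, 0.75):.0f} kWh")
    print(f"ground night (~12 h),  10 kW load: {storage_needed_kwh(10, 12):.0f} kWh")

That works out to ~1.33x more power per panel, and roughly 8 kWh of storage per 10 kW of load in LEO versus ~120 kWh on the ground.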
> How/why is this better than an enormous solar-powered datacenter in someplace like the middle of the Mojave Desert?
Night.
I mean, how good an idea this actually is depends on what energy storage costs, how much faster PV degrades in space than on the ground, launch costs, how much stuff can be up there before a Kessler cascade, if ground-based lasers get good enough to shoot down things in whatever orbit this is, etc., but "no night unless we want it" is the big potential advantage of putting PV in space.
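For what it's worth, the shape of that comparison can be written down even without knowing the real numbers. A toy sketch follows; every input is a placeholder assumption (not a real cost estimate), and degradation, Kessler risk, maintenance, and the rest are left out entirely.

    # Toy capital-cost comparison: orbital PV (pay for launch, skip the night)
    # vs. ground PV (pay for storage and a capacity factor).
    # All numbers are placeholder assumptions, not real cost data.

    def space_capex_per_kw(pv_usd_per_kw, kg_per_kw, launch_usd_per_kg):
        """Capital cost per continuously available kW in a near-permanent-sunlight orbit."""
        return pv_usd_per_kw + kg_per_kw * launch_usd_per_kg

    def ground_capex_per_kw(pv_usd_per_kw, capacity_factor,
                            storage_kwh_per_kw, storage_usd_per_kwh):
        """Capital cost per continuously available kW on the ground (oversized array plus batteries)."""
        return pv_usd_per_kw / capacity_factor + storage_kwh_per_kw * storage_usd_per_kwh

    # Made-up inputs, purely to show how the trade swings with launch cost.
    for launch_usd_per_kg in (1500, 150, 15):
        space = space_capex_per_kw(pv_usd_per_kw=500, kg_per_kw=10,
                                   launch_usd_per_kg=launch_usd_per_kg)
        print(f"launch at ${launch_usd_per_kg}/kg -> space: ${space:,.0f} per continuous kW")

    ground = ground_capex_per_kw(pv_usd_per_kw=500, capacity_factor=0.25,
                                 storage_kwh_per_kw=12, storage_usd_per_kwh=200)
    print(f"ground (same made-up inputs): ${ground:,.0f} per continuous kW")

With those made-up inputs, ground wins at today's launch prices and the orbital option only starts to pencil out once $/kg falls by roughly an order of magnitude, which is exactly the sensitivity the thread is arguing about.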
More stupidity coming out of the corporate self-promoters who inhabit Google. Not surprised to see Blaise Agüera y Arcas on the paper; he did a TED talk in 2016 trumpeting a bunch of Google image generation research which, as far as I can tell, was not related to him at all. No related published research. Put the real researchers in the TED talks, please.
This is dual-use technology for the weapon systems needed for Golden Dome. Engineers should be wary when they're getting asked to work on things that don't make economic sense.
Data centers in space are guaranteed to be a thing by 2035.
https://x.com/elonmusk/status/1984868748378157312
https://x.com/elonmusk/status/1985743650064908694
https://x.com/elonmusk/status/1984249048107508061
0.5% of the Starlink network deorbits each month currently, though potentially more.
They're already having a negative, contaminating effect on our upper atmosphere.
Sending up bigger ones, and more of them (today there are some 8,800, but they target 30k), sounds ill-advised. (Quick scale check below.)
1: https://www.fastcompany.com/91419515/starlink-satellites-are...
2: https://www.science.org/content/article/burned-satellites-ar...
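A quick scale check on those numbers; the fleet sizes and the 0.5%/month rate are the figures from the comment above.

    # Reentries per month and per year implied by a 0.5%/month deorbit rate.
    DEORBIT_RATE_PER_MONTH = 0.005  # rate quoted above

    for label, fleet_size in (("today, ~8,800 sats", 8_800), ("target, ~30,000 sats", 30_000)):
        monthly = fleet_size * DEORBIT_RATE_PER_MONTH
        print(f"{label}: ~{monthly:.0f} reentries/month, ~{monthly * 12:.0f} per year")

That's roughly 44 reentries a month today, and about 150 a month (1,800 a year) at the 30k target.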
However, 10 years in Musk time is at least 30 years in real time.
Had me going for a minute there.
Poe's Law strikes again!