Comment by crote

10 hours ago

That's pretty much a solved problem. We've had geostationary satellites broadcasting TV at hundreds of megabits per second for decades now, and laser links for sat-to-sat comms seem to be making decent progress as well.

> it is possible to put 500 to 1000 TW/year of AI satellites into deep space, meaningfully ascend the Kardashev scale and harness a non-trivial percentage of the Sun’s power

Which satellites are operating from "deep space"?
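For scale, one-way light-speed latency alone makes "deep space" qualitatively different from GEO. A rough sketch (distances are approximate round figures, not from the thread):

```python
# Rough one-way light-speed latency at various distances.
C_KM_S = 299_792.458  # speed of light, km/s

distances_km = {
    "LEO (~550 km)": 550,
    "GEO (~35,786 km)": 35_786,
    "Moon (~384,400 km)": 384_400,
    "1 AU (~1.496e8 km)": 149_600_000,
}

for label, d in distances_km.items():
    # One-way time = distance / c; double it for a minimum round trip.
    print(f"{label}: {d / C_KM_S:.3f} s one-way")
```

GEO is ~0.12 s one-way; a probe at 1 AU is over eight minutes one-way before any protocol overhead.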

  • Those are for video. AI chat workflows use a fraction of that data.

    • That's silly on so many levels.

      1. The latency is going to be insane.

      2. AI video exists.

      3. Vision LLMs (VLMs) exist and take video and images as input.

      4. When a new model checkpoint needs to go up, are we supposed to wait months for it to transfer?

      5. A one-million-token context window is ~4 MB. That transfers in a few milliseconds terrestrially; over a deep-space link, even assuming zero packet loss, it takes many seconds.

      6. You're not using TCP for this because the round-trip time is so high, which also means you can't cancel a job when a user disconnects.

      7. How do you scale this? How many megabits per second has anyone actually sent over the distances in question? We literally don't know how to get a data center's worth of throughput to anything beyond Earth orbit; real deep-space links top out at double-digit megabits per second.
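To put rough numbers on points 4, 5, and 7, here's a back-of-envelope sketch. The 10 Mbps link rate and the 1 TB checkpoint size are illustrative assumptions, not figures from the thread:

```python
# Back-of-envelope transfer times over a bandwidth-limited deep-space link.
LINK_MBPS = 10  # assumed link rate; real deep-space links vary widely

def transfer_seconds(size_bytes: int, link_mbps: float = LINK_MBPS) -> float:
    """Time to move size_bytes at link_mbps megabits per second."""
    return size_bytes * 8 / (link_mbps * 1_000_000)

context_bytes = 4_000_000              # ~1M-token context window, ~4 MB
checkpoint_bytes = 1_000_000_000_000   # hypothetical 1 TB model checkpoint

print(f"4 MB context:    {transfer_seconds(context_bytes):.1f} s (plus round-trip latency)")
print(f"1 TB checkpoint: {transfer_seconds(checkpoint_bytes) / 86_400:.1f} days")
```

At an assumed 10 Mbps, a single 4 MB context takes seconds on top of the light-speed round trip, and a 1 TB checkpoint takes on the order of days; scale the checkpoint up or the link rate down and the upload time stretches toward months.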
