
Comment by hirsin

21 hours ago

Good point - the comms satellites are not even "keeping" some of the energy, while a DC would. I _am_ now curious about the connection between bandwidth and wattage, but I'm willing to bet that less than 1% of the total energy dissipation on one of these DC satellites would be in the form of satellite-to-earth broadcast (keeping in mind that s2s broadcast would presumably be something of a wash).
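
A rough link-budget sketch (Python) gives a feel for the bandwidth-to-wattage connection; every number in it (data rate, Eb/N0, antenna gains, noise temperature, margin, total power budget) is an illustrative assumption, not a figure from the thread:

```python
# Back-of-envelope link budget: roughly how much RF power does a given
# downlink data rate require, and what fraction of an assumed satellite
# power budget is that? All numbers are illustrative guesses.
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
C   = 3e8                   # speed of light, m/s

# --- assumed downlink parameters (hypothetical) ---
data_rate_bps = 100e9       # 100 Gbps space-to-earth link
ebn0_dB       = 10.0        # required Eb/N0 for the modulation/coding
system_temp_K = 300.0       # ground receiver system noise temperature
freq_Hz       = 20e9        # Ka-band carrier
distance_m    = 550e3       # LEO slant range
tx_gain_dB    = 35.0        # satellite transmit antenna gain
rx_gain_dB    = 40.0        # ground-station antenna gain
margin_dB     = 6.0         # rain fade / pointing / implementation margin

# Required received power: P_rx = (Eb/N0) * k_B * T * R
p_rx_W = 10 ** (ebn0_dB / 10) * k_B * system_temp_K * data_rate_bps

# Free-space path loss (Friis): 20*log10(4*pi*d*f/c)
fspl_dB = 20 * math.log10(4 * math.pi * distance_m * freq_Hz / C)

# Solve the link budget for the radiated transmit power
p_tx_dBW = (10 * math.log10(p_rx_W) + fspl_dB + margin_dB
            - tx_gain_dB - rx_gain_dB)
p_tx_W = 10 ** (p_tx_dBW / 10)

# Compare against an assumed total power budget for a "DC satellite".
# Note: at a realistic PA efficiency the DC input to the transmitter would be
# several times p_tx_W, but still a small slice of a 100 kW budget.
total_budget_W = 100e3      # hypothetical 100 kW compute payload
print(f"radiated RF power: {p_tx_W:.0f} W")
print(f"share of total budget: {p_tx_W / total_budget_W:.2%}")
```

Under these made-up inputs the radiated power comes out to roughly 100 W, on the order of 0.1% of the assumed budget, which is at least consistent with the "less than 1%" bet.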

I am willing to bet that more than 10% of the electrical energy consumed by the satellite is converted into transmitted microwaves.

There are many power consumers on a satellite, e.g. radio receivers, lasers, computers, and motors, whose consumed energy is eventually converted into heat, but the radio transmitter of a communications satellite must account for a large fraction of the average power consumption.

The radio transmitter itself is highly efficient, well above 50% and possibly above 90%, so only a small fraction of the electrical power it consumes is converted into heat; most of it is radiated as the microwave signal that goes to Earth's surface.
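
As a sanity check on that bet, a minimal sketch of the arithmetic; both the transmitter's share of the bus power and its efficiency are assumed values rather than figures from the comment:

```python
# If the transmitter draws some share of the satellite's electrical power at
# some efficiency, this is the fraction of total power radiated as microwaves.
def radiated_fraction(tx_share_of_bus_power: float, tx_efficiency: float) -> float:
    return tx_share_of_bus_power * tx_efficiency

# e.g. the transmitter takes 30% of the bus power (a guess) at 50-90% efficiency
for eff in (0.5, 0.7, 0.9):
    frac = radiated_fraction(0.30, eff)
    print(f"efficiency {eff:.0%}: {frac:.0%} of total power leaves as RF")
```

Any transmitter share above roughly 20% combined with an efficiency above 50% clears the 10% bar, which is the shape of the argument above.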

  • Unfortunately this is not the case. The amplifiers on the transmit-side phased arrays are about 10% efficient (perhaps 12% on a good day), but the amps represent only ~half the power consumption of the transmit phased arrays. The beamformers and processors are 0% efficient. The receive-side phased arrays are of course 0% efficient as well. (A rough run of these numbers is sketched at the end of the thread.)

    • I'm curious. I think the whole thing (space-based compute) is infeasible and stupid for a bunch of reasons, but even a class-A amplifier has a theoretical limit of 50% efficiency, and I thought we used class-C amplifiers (with practical efficiencies above 50%) in FM/FSK/etc. applications in which amplitude distortion can be filtered away. What puts these systems down at 10%? (The class-A numbers are also sketched at the end of the thread.)

      1 reply →
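
A rough run of the numbers from the phased-array reply above; the 10% amplifier efficiency and the "amps are ~half the transmit array's draw" figures come from that comment, while the absolute power levels for the arrays and the rest of the bus are placeholders:

```python
# Fraction of power actually radiated, using the reply's efficiency numbers.
pa_efficiency        = 0.10   # transmit PAs ~10% efficient (per the reply)
pa_share_of_tx_array = 0.50   # amps are ~half of the transmit array's draw

tx_array_W = 1000.0           # hypothetical transmit phased-array input power
radiated_W = tx_array_W * pa_share_of_tx_array * pa_efficiency
print(f"{radiated_W:.0f} W radiated out of {tx_array_W:.0f} W into the "
      f"transmit array ({radiated_W / tx_array_W:.0%})")

# Receive arrays, beamformers, processors, etc. radiate nothing useful,
# so they only dilute the fraction further (these loads are placeholders).
rx_array_W  = 800.0
other_bus_W = 1200.0
total_W     = tx_array_W + rx_array_W + other_bus_W
print(f"{radiated_W / total_W:.1%} of the whole satellite's power "
      f"leaves as RF; the rest is heat to reject")
```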
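
And a sketch of the textbook class-A figures behind the 50% limit mentioned in the last question: ideal class-A drain efficiency peaks at 50% at full output swing and scales with output power when the amplifier is backed off, since the DC bias power stays constant. This is generic amplifier math, not a claim about any particular satellite hardware:

```python
# Ideal class-A drain efficiency: 50% at peak output, and proportional to
# output power as the amplifier is backed off (the DC bias power is constant).
def class_a_efficiency(backoff_dB: float) -> float:
    peak = 0.5                           # theoretical class-A maximum
    return peak * 10 ** (-backoff_dB / 10)

for backoff_dB in (0, 3, 6, 9):
    print(f"{backoff_dB} dB output back-off: {class_a_efficiency(backoff_dB):.1%}")
```

Several dB of output back-off already pulls the ideal figure into the low teens.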