Comment by adrian_b
13 hours ago
I am willing to bet that more than 10% of the electrical energy consumed by the satellite is converted into transmitted microwaves.
There must be many power consumers in the satellite, e.g. radio receivers, lasers, computers and motors, whose consumed energy is eventually converted into heat, but the radio transmitter of a communication satellite must account for a big fraction of the average consumed power.
The radio transmitter itself has high efficiency, much greater than 50% and possibly greater than 90%, so only a small fraction of the electrical power consumed by the transmitter is converted into heat; most is radiated in the microwave signal that goes to Earth's surface.
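As a rough back-of-the-envelope illustration of that reasoning (the transmitter's share of the power budget and its efficiency here are assumed figures for the sake of the arithmetic, not Starlink data):

    # Sketch of the argument: fraction of satellite power radiated as microwaves
    # = transmitter's share of DC power * transmitter DC-to-RF efficiency.
    transmitter_share = 0.30       # assumed fraction of satellite power fed to the transmitter
    transmitter_efficiency = 0.60  # assumed DC-to-RF efficiency (">50%" per the comment)

    fraction_radiated = transmitter_share * transmitter_efficiency
    print(f"Fraction of satellite power radiated as microwaves: {fraction_radiated:.0%}")
    # -> 18%, i.e. comfortably above 10% if these assumptions hold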
Unfortunately this is not the case. The amplifiers on the transmit-side phased arrays are about 10% efficient (perhaps 12% on a good day), but the amps represent only ~half the power consumption of the transmit phased arrays. The beamformers and processors are 0% efficient. The receive-side phased arrays are of course 0% efficient as well.
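For concreteness, a minimal sketch of the arithmetic those numbers imply; only the ~10% amplifier efficiency and the "amps are ~half of the transmit-array draw" split come from the comment above, the receive-array power draw is an illustrative assumption:

    # Sketch of the DC-to-RF efficiency of the whole antenna subsystem.
    pa_efficiency = 0.10   # transmit power amplifiers (~12% on a good day), per the comment
    tx_array_power = 1.0   # normalize transmit phased-array power draw to 1
    pa_share = 0.5         # amps are ~half of the transmit-array consumption, per the comment
    rx_array_power = 0.5   # assumed receive-array draw; none of it becomes downlink RF

    rf_out = tx_array_power * pa_share * pa_efficiency
    total_power = tx_array_power + rx_array_power
    print(f"Effective DC-to-RF efficiency: {rf_out / total_power:.1%}")
    # -> ~3.3%, well below the >10% figure claimed in the parent comment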
I'm curious. I think the whole thing (space-based compute) is infeasible and stupid for a bunch of reasons, but even a class-A amplifier has a theoretical limit of 50% efficiency, and I thought we used class-C amplifiers (with practical efficiencies above 50%) in FM/FSK/etc. applications, where amplitude distortion can be filtered away. What pushes these systems down to 10%?
Yes, a 10% efficiency is very weird if true.
Nowadays such microwave power amplifiers should be made with gallium nitride transistors, which should allow better efficiency than older amplifiers using LDMOS transistors or travelling-wave tubes, and even those had efficiencies over 50%.
For beamformers, recent research papers have claimed large reductions in losses, but presumably the Starlink satellites are still using a more mature technology with higher losses.