Comment by Etheryte

6 hours ago

It's a radio telescope, how would you imagine translating that to bytes?

Are you being deliberately obtuse about the play on words between the programmer's sense of "array" and an array of antennas?

Every sensor in the array is sampling at some frequency, so, to first order, multiplying the sampling frequency by the sample size (and the number of sensors) gives you an idea of the input bandwidth in bytes/second. There are of course bandwidth reduction steps after that (filtering, downsampling, beamforming)...
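That first-order estimate can be sketched in a few lines. All numbers below (64 antennas, 100 MHz sampling, 2-byte samples) are illustrative assumptions, not figures for any real telescope:

```python
# Back-of-envelope raw data rate for a sensor array, before any
# bandwidth reduction (filtering, downsampling, beamforming).
# The parameter values used below are hypothetical examples.

def raw_data_rate(num_sensors: int, sample_rate_hz: float, bytes_per_sample: int) -> float:
    """First-order raw input bandwidth in bytes/second."""
    return num_sensors * sample_rate_hz * bytes_per_sample

# e.g. 64 antennas, each sampled at 100 MHz with 2-byte samples
rate = raw_data_rate(num_sensors=64, sample_rate_hz=100e6, bytes_per_sample=2)
print(f"{rate / 1e9:.1f} GB/s")  # 12.8 GB/s raw, pre-reduction
```

The point is only that the raw rate scales linearly in all three factors; the downstream pipeline then discards most of it.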

  • This makes no sense though? Given the Nyquist theorem, simply increasing the sampling frequency past a certain point doesn't change the outcome.