
Comment by dylan604

2 years ago

The big difference to me between analog and digital broadcast is that an analog set could receive part of the signal, display the poor video, and then the image and sound cleaned up as the signal was dialed in. With digital, you're either receiving the stream of 1s and 0s or you're not. If you miss enough of it, you have no signal to decode.
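That graceful-degradation vs. cliff-effect contrast can be sketched in a few lines. This is a toy model, not how any real receiver works: the decode threshold and the "perfect copy above the threshold" behavior are simplifications I'm assuming for illustration.

```python
def analog_receive(frame, quality):
    # Analog degrades gracefully: a weak signal still produces a
    # picture, just buried in proportionally more noise ("snow").
    noise_amplitude = 1.0 - quality
    return [sample + noise_amplitude * ((i % 2) * 2 - 1) * 0.5
            for i, sample in enumerate(frame)]

def digital_receive(bits, quality, threshold=0.5):
    # Digital is all-or-nothing: below some bit-error threshold the
    # decoder cannot recover the stream at all (the "cliff effect").
    # Above it, error correction hands back a perfect copy.
    if quality < threshold:
        return None  # no picture, no sound -- just "No Signal"
    return list(bits)

frame = [0.2, 0.8, 0.5]
print(analog_receive(frame, quality=0.3))   # noisy but still a picture
print(digital_receive([1, 0, 1], 0.9))      # decodes cleanly
print(digital_receive([1, 0, 1], 0.3))      # nothing at all
```

The asymmetry is the whole point: the analog path never returns `None`, it just gets worse.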

I met an old-timer satellite dish installer (the big giant dish type) who used this to align newly installed dishes. He knew the location of a specific satellite and the frequency it was broadcasting on, and he had a tuner dialed in for that channel. Once he found it, he'd move along the one axis and count the number of signals he passed until he got to the satellite he was actually meant to receive. He'd then swap in the receiver for that signal to verify.

The loss of analog just made the playing and experimenting much less fun.

"analog" used high-power horizontal and vertical sync pulses. These were hard for the receiver to lose or mess up if there was any signal at all. Put another way, the receiver would lock onto the sync pulses (aka "blacker than black") before even a shred of the video could be decoded. The sync pulses had such good SNR compared with the picture content that whenever the picture was even barely visible, it was solidly in sync.
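The point about sync riding well clear of the picture can be sketched numerically. A sync separator only has to answer "is this sample far below black?", and the large gap between black and the sync tip survives noise that completely scrambles picture detail. The IRE levels below are standard NTSC-style values; the threshold and the deterministic noise pattern are made up for illustration.

```python
# Approximate NTSC composite levels, in IRE units:
SYNC_TIP = -40.0   # sync pulses sit "blacker than black"
BLACK = 7.5        # black (setup) level
WHITE = 100.0

def find_sync(samples, threshold=-20.0):
    # A crude sync separator: any sample well below black counts as sync.
    return [i for i, v in enumerate(samples) if v < threshold]

# One scan line: 5 samples of sync tip, then a picture ramp
# climbing from black toward white.
line = [SYNC_TIP] * 5 + [BLACK + 4.0 * i for i in range(20)]

# Deterministic +/-15 IRE "noise": enough to wreck picture detail,
# nowhere near enough to lift a sync tip above the threshold.
noisy = [v + (15.0 if i % 2 else -15.0) for i, v in enumerate(line)]

print(find_sync(noisy))  # [0, 1, 2, 3, 4] -- sync still locks
```

Even with the picture buried in 15 IRE of noise, every sync sample stays below -25 IRE and every picture sample stays above -7.5 IRE, so the separator never misfires. That's the "solidly in sync before the picture is watchable" effect in miniature.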

  • There's a lot about the old analog video signal that was just fun to me like that. The fact that the color signal could be received by a b&w set was cool, except now, with hindsight being 20/20, we have to deal with the ramifications of that cool trick. Part of it lingered into HD, but we're finally getting rid of most of that baggage with 4K.

    Things like the color burst, the 1-volt peak-to-peak level, whiter than white, how a video signal that was too hot could interfere with the audio multiplexed into the RF signal, how an out-of-spec signal could distort the picture. Just all sorts of things that were fun to mess with.