Comment by gosub100

2 years ago

Digital TV never worked as well as analog. In my opinion, the switch should never have happened until it could be qualitatively proven that digital > analog. By "greater than" / better, I mean not interrupting the viewing experience, especially the audio. This test would be done using stock antennas within a reasonable distance from the transmitter. Or even better, actually ask users which they prefer, UHF analog or digital, and don't switch until 2/3 or more prefer digital. I've never consistently watched DTV, because inevitably a disruption will come along that blocks the audio for about 1.5 s and completely freezes the video. It's simply a waste of time.

> Digital TV never worked as well as analog. In my opinion, the switch should never have happened until it could be qualitatively proven that digital > analog. By "greater than" / better, I mean not interrupting the viewing experience, especially the audio.

I'm not sure I agree with you. Audio-wise, you may be right, but video-wise, it took a lot of work to get near-perfect analog reception, without ghosting and other weirdness, whereas if you've got a comfortable margin above the digital cliff, you get an uninterrupted picture and audio, and it will be as good as it gets.

Now, when someone at the station decides they should stuff 8 subchannels of 1080i over the 20 Mbps carrier with static multiplexing, that's going to look awful. Dynamic multiplexing helps, but it doesn't work miracles either. If the broadcaster does 1 HD stream at about 12-15 Mbps, it can look pretty good, as long as it's not flowing water or Olympic diving, and one or two, maaaybe three SD subchannels in the rest of the bandwidth are OK too.
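The bitrate math behind that complaint is easy to check. A back-of-the-envelope sketch (the ~19.4 Mbps figure is the nominal ATSC 1.0 payload rate; the per-stream targets are rough MPEG-2 rules of thumb, not measurements):

```python
# Rough ATSC 1.0 (8-VSB) payload rate for one 6 MHz channel, in Mbps.
ATSC_PAYLOAD_MBPS = 19.4

def static_mux_share(total_mbps, n_subchannels):
    """Static multiplexing: each subchannel gets a fixed, equal slice."""
    return total_mbps / n_subchannels

# 8 x 1080i crammed into one carrier: about 2.4 Mbps per stream,
# far below the ~12-15 Mbps a clean MPEG-2 1080i stream wants.
per_stream = static_mux_share(ATSC_PAYLOAD_MBPS, 8)
print(f"{per_stream:.1f} Mbps per subchannel")
```

Dynamic multiplexing improves on the equal split by shifting bits toward whichever subchannel is hardest to encode at the moment, but the total is still capped by the carrier, which is why "miracles" are off the table.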

If you don't have a comfortable margin, though, it's much worse. Analog TV audio was usually pretty decent even with a very snowy picture. And then there are the delays when tuning to a new channel.

The big difference to me about analog vs digital broadcast is that analog could receive part of the signal, display the poor video, and then clean up the image and sound as the signal was dialed in. With digital, you're either receiving the stream of 1s and 0s or you're not. If you miss enough of it, you have no signal to decode.
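That contrast can be sketched as a toy model: analog quality falls off smoothly with SNR, while a digital receiver decodes perfectly above its error-correction threshold and not at all below it. The threshold and the linear quality curve here are illustrative, not real receiver specs:

```python
# Illustrative FEC threshold, in dB -- not a real receiver figure.
DIGITAL_THRESHOLD_DB = 15.0

def analog_quality(snr_db):
    """Graceful degradation: snowier as SNR drops, but never all-or-nothing."""
    return max(0.0, min(1.0, snr_db / 40.0))

def digital_quality(snr_db):
    """The cliff: above the threshold, perfect; below it, no stream at all."""
    return 1.0 if snr_db >= DIGITAL_THRESHOLD_DB else 0.0

for snr in (30, 16, 14):
    print(snr, round(analog_quality(snr), 2), digital_quality(snr))
```

At 16 dB the digital picture is flawless; at 14 dB it is gone entirely, while the analog picture at both points is merely somewhat snowy.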

I met an old-timer satellite dish installer (big giant dish types) who used this to align newly installed dishes. He knew the location of a specific satellite and the frequency it was broadcasting on, and had a tuner dialed in for that channel. Once he found it, he'd move along the one axis and count the number of signals he passed until he got to the satellite he actually meant to receive. He'd then swap in the receiver for that signal to verify.

The loss of analog just made the playing and experimenting much less fun.

  • "analog" used high-power horizontal and vertical sync pulses. These were hard for the receiver to lose / mess up, if there was any signal at all. Put another way, the receiver would receive the sync pulses (aka 'blacker than black) before even a shred of the video could be decoded. Another way to say this: The sync pulses had such good SNR compared with the picture content, so when the picture content was even barely visible, it was solidly in sync.

    • There's a lot about the old analog video signal that was just fun to me like that. The fact that the color signal could be received by a b&w set was cool, except that now, with hindsight being 20/20, we have to deal with the ramifications of that coolness. Part of it lingered into HD, but we're finally getting rid of most of that baggage with 4K.

      Things like the color burst, 1 volt peak-to-peak, whiter than white, how a video signal that ran too hot could interfere with the audio multiplexed into the RF signal, how an out-of-spec signal could distort the picture. Just all sorts of things that were fun to mess with.
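The levels mentioned in that thread can be put in one place. A quick sketch using the standard NTSC composite numbers: the whole 1 V peak-to-peak signal spans 140 IRE units, with the sync tip a full 40 IRE below blanking, which is why sync survived when picture content didn't:

```python
# Standard NTSC composite video levels, in IRE units.
# 140 IRE == 1 V peak-to-peak, from sync tip (-40) to reference white (+100).
IRE_PER_VOLT = 140
LEVELS_IRE = {
    "sync_tip": -40,     # "blacker than black": below anything in the picture
    "blanking": 0,
    "black_setup": 7.5,  # NTSC black pedestal
    "white": 100,
}

def ire_to_volts(ire):
    """Volts above the sync tip for a given IRE level."""
    return (ire - LEVELS_IRE["sync_tip"]) / IRE_PER_VOLT

# Sync swings 40 IRE below blanking -- a bigger step than black-to-dark-gray
# picture detail, so a set could hold sync on a signal too snowy to watch.
print(ire_to_volts(LEVELS_IRE["white"]))  # prints 1.0 (the full 1 V span)
```

The color burst (~3.58 MHz, riding on the back porch of each line) sat in that same blanking region, which is part of why a b&w set could simply ignore it.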

In the video of the full interview (linked below the main video), he explains that some of the formats he uses overcome those disruptions when the signal is weak.