Comment by mrandish

1 month ago

It's worth deep diving into how analog composite broadcast television works, because you quickly realize just how insanely ambitious it was for 1930s engineers not only to have conceived it, but to have perfected and shipped it at consumer scale using only 1930s technologies.

Being old enough to have learned video engineering at the end of the analog days, it's kind of fun helping young engineers today wrap their brains around completely alien concepts, like "the image is never pixels," "it's never digital," and "it's never quantized." Those who've been raised in a digital world learn to understand things from a fundamentally digital frame of reference. Even analog signals are often reasoned about as if their quantized form were their "true nature."

Interestingly, I suspect the converse would be equally true trying to explain digital television to a 1930s video engineer. They'd probably struggle similarly, always mentally remapping digital images to their "true" analog nature. The fundamental nature of their world was analog. Nothing was quantized. Even the idea that "quanta" might be at the root of physics was newfangled, suspect and, even if true, of no practical use in engineering systems.

Yes, agreed! And while it is not quantized as such, there is an element of semi-digital protocol to it. The concept of a "scanline" is quantized, and there are "protocols" for indicating when a line ends, when a picture ends, etc., that the receiver and sender need to agree on... plus colorburst "packets" per line, delay lines, and all kinds of other clever techniques, so it is extremely complicated. Many things were necessary to overcome distortion and also to ensure backwards compatibility: first, how do you fit in the color so a monochrome TV can still show it? Later, how do you make it 16:9 so it can still show on a 4:3 TV (which it could!)?
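To illustrate the "semi-digital protocol" idea: a sync separator in a TV is essentially a comparator deciding, by voltage threshold, which parts of the waveform are sync rather than picture. Here's a toy sketch (my own illustration, not from this thread) using the standard NTSC IRE levels, where sync tip sits at -40 IRE, blanking at 0, and reference white at +100:

```python
# Toy sync separator: everything below a threshold voltage is "sync".
# IRE levels are the standard NTSC values (sync tip -40, blanking 0,
# white +100); the function and names are illustrative only.

SYNC_THRESHOLD_IRE = -20  # halfway between blanking (0) and sync tip (-40)

def separate_sync(samples_ire):
    """Return True for each sample that falls in the sync region."""
    return [level < SYNC_THRESHOLD_IRE for level in samples_ire]

# A toy slice of one scanline: sync pulse, back porch, then picture content.
line = [-40, -40, -40, 0, 0, 20, 60, 100, 50, 0]
print(separate_sync(line))  # → [True, True, True, False, ...]
```

This is exactly the kind of "digital" decision hiding inside an otherwise continuous analog signal.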

  • > And while it is not quantized as such there is an element of semi-digital protocol to it.

    Yes, before posting I did debate that exact point in my head, with scanlines as the clearest example :-). However, I decided the point is still directionally valid because ultimately most timing-centric analog signal encoding has some aspect of being quantized, if only to thresholds. Technically it would be more correct to narrow my statement about "never quantized" to the analog waveform driving the electron gun as it sweeps horizontally across a line. It always amazes digital-centric engineers weaned on pixels when they realize the timing of the electron gun sweep in every viewer's analog TV was literally created by the crystal driving the sweep of the 'master' camera in the TV studio (and would drift in phase with that crystal as it warmed up!). It's the inevitable consequence of there being no practical way to store or buffer such a high-frequency signal for re-timing. Every component in the chain, from the cameras to switchers to transmitters to TVs, had to lock to the master clock. Live TV in those days was truly "live" to within 63.5 microseconds of photons hitting vacuum tubes in the camera (plus the time it took for the electrons to move from here to there). Today, "live" HDTV signals are so digitally buffered, re-timed and re-encoded at every step on their way to us, we're lucky if they're within 20 seconds of photons striking imagers.
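For anyone who wants to check where the 63.5 microsecond figure comes from, here's a quick sketch using the standard NTSC relationships (my numbers, not from the comment): the color subcarrier is defined as 315/88 MHz, and there are exactly 227.5 subcarrier cycles per scanline, so the line rate and line period fall out directly.

```python
# Deriving the NTSC scanline period from the color subcarrier definition.

f_sc = 315e6 / 88            # color subcarrier: ~3.579545 MHz
f_h = f_sc / 227.5           # horizontal line rate: ~15734.27 Hz
line_period_us = 1e6 / f_h   # one scanline: ~63.56 microseconds

print(f"subcarrier: {f_sc / 1e6:.6f} MHz")
print(f"line rate:  {f_h:.2f} Hz")
print(f"line time:  {line_period_us:.2f} us")  # → 63.56 us
```

That one line period really was the whole latency budget between the studio camera and your screen.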

    My larger point though was that in the 1930s even that strict signal timing had to be encoded and decoded purely with discrete analog components. I have a 1950s Predicta television, and looking at the components on the boards one can't help wondering "how the hell did they come up with this crazy scheme." It drives home just how bonkers the whole idea of analog composite television was for the time.

    > first, how do you fit in the color so a monochrome TV can still show it?

    To clarify for anyone who may not know, analog television was created in the 1930s as a black-and-white composite standard defined by the EIA in the RS-170 specification, then in 1953 color was added by a very clever hack which kept all broadcasts backward compatible with existing B&W TVs (defined in the RS-170A specification). Politicians mandated this because they feared nerfing all the B&W TVs owned by voters. But that hack came with some significant technical compromises which complicated and degraded color analog video for over 50 years.

    • Yes, I knew what you meant, and fully agree. It is fascinating that TV is even possible just out of all these rather simple and bulky analog components. Even the first color TVs were built with vacuum tubes and no transistors.

      As I recall there are all kinds of hacks in the design to keep them cheap. For instance, letting the flyback transformer that produces the needed high voltages operate at the same frequency as the horizontal scan rate (~15 kHz), so that mechanism essentially serves double duty. The same was even seen in microcomputers, where the same crystal needed for TV was also used for the microprocessor, meaning that e.g. a "European" Commodore 64 with PAL was actually a few percent slower than an American C64 with NTSC. And other crazy things like that.
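The C64 case can be checked with the actual published crystal and divider values (my numbers, not the poster's): both machines derive the CPU clock from the color crystal, which runs at 4x the color subcarrier frequency of each TV system.

```python
# Commodore 64 CPU clocks, derived from the TV color crystals.

ntsc_crystal = 4 * 315e6 / 88    # 14.31818... MHz (4 x NTSC subcarrier)
pal_crystal = 4 * 4433618.75     # 17.734475 MHz (4 x PAL subcarrier)

ntsc_cpu = ntsc_crystal / 14     # ~1.0227 MHz
pal_cpu = pal_crystal / 18       # ~0.9852 MHz

slowdown = (1 - pal_cpu / ntsc_cpu) * 100
print(f"PAL C64 CPU is ~{slowdown:.1f}% slower")  # → ~3.7% slower
```

So "a few percent" works out to roughly 3.7% in raw clock speed (the 50 Hz vs 60 Hz frame rates added further timing differences on top of that).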

It's interesting how early digital video systems were influenced by the analog aspects. DVDs were very much still defined by NTSC/PAL even though the data is fully digital.

  • Indeed and even today's HDTV specification has elements based on echoes reverberating all the way from decisions made in the 1930s when specifying B&W TV.

    The composite and component sampling rates (14.32 MHz and 13.5 MHz) both trace directly back to analog television's color subcarrier and line frequencies. And those two frequencies directly dictated all the odd-seeming horizontal pixel resolutions we find in pre-HD digital video (352, 704, 360, 720 and 768) and even the original PC display resolutions (CGA, VGA, XGA, etc.).

    For example, the 720 horizontal pixels of DVD and digital satellite broadcasts were tied to the digital component video standard sampling the active picture area of an analog video scanline at 13.5 MHz to capture the 1440 clock transitions in that waveform. Similarly, 768 (another common horizontal resolution in pre-HD video) is tied to the composite video standard sampling at 14.32 MHz to capture 1536 clock transitions. The history of how these standards were derived is fascinating (https://tech.ebu.ch/docs/techreview/trev_304-rec601_wood.pdf).
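As a quick sanity check on those sampling numbers, here's the standard Rec. 601 arithmetic (my summary, not from the comment): 13.5 MHz was chosen so that it divides into a whole number of samples per scanline for both the 525-line (NTSC) and 625-line (PAL) systems, with 720 of those samples covering the active picture in each.

```python
# Why 13.5 MHz: a whole number of samples per line in BOTH TV systems.

f_s = 13.5e6                 # Rec. 601 luma sampling rate

f_h_ntsc = 4.5e6 / 286       # NTSC line rate: ~15734.27 Hz
f_h_pal = 15625.0            # PAL line rate

samples_ntsc = f_s / f_h_ntsc
samples_pal = f_s / f_h_pal

print(round(samples_ntsc))   # → 858 samples per total NTSC line
print(round(samples_pal))    # → 864 samples per total PAL line

# 720 of those samples span the active picture in both systems; the rest
# fall in blanking and sync. Hence the ubiquitous 720-pixel width of SD video.
```

Getting one sampling rate to land on integers for two incompatible line rates is the whole reason 13.5 MHz won.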

    VGA's horizontal resolution of 640 simply comes from adjusting analog video's rectangular pixels to be square (704 active pixels × the 10/11 pixel aspect ratio = 640). It's kind of fascinating that all these modern digital resolutions can be traced back to decisions made in the 1930s based on which affordable analog components were available, which competing commercial interests prevailed (RCA vs Philco) and the political sensitivities present at the time.
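The square-pixel arithmetic behind 640 is easy to verify with the standard BT.601 numbers (my derivation; the 10/11 pixel aspect ratio and the 704-pixel "clean aperture" are the usual published values for 525-line systems):

```python
# Scaling 601-rate non-square pixels to square pixels gives VGA's 640.
from fractions import Fraction

PAR = Fraction(10, 11)   # BT.601 525-line pixel aspect ratio
active = 704             # clean-aperture width (720 minus edge samples)

square_width = active * PAR
print(square_width)      # → 640, exactly
```

It comes out as an exact integer, which is presumably no accident.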