Comment by mrandish

25 days ago

> And while it is not quantized as such there is an element of semi-digital protocol to it.

Yes, before posting I did debate that exact point in my head, with scanlines as the clearest example :-). However, I decided the point is still directionally valid because ultimately most timing-centric analog signal encoding has some aspect of being quantized, if only to thresholds. Technically it would be more correct to narrow my statement about "never quantized" to the analog waveform driving the electron gun as it sweeps horizontally across a line. It always amazes digital-centric engineers weaned on pixels when they realize the timing of the electron gun sweep in every viewer's analog TV was literally created by the crystal driving the sweep of the 'master' camera in the TV studio (and would drift in phase with that crystal as it warmed up!). It's the inevitable consequence of there being no practical way to store or buffer such a high-frequency signal for re-timing. Every component in the chain from the cameras to switchers to transmitters to TVs had to lock to the master clock. Live TV in those days was truly "live" to within 63.5 microseconds of photons hitting vacuum tubes in the camera (plus the time it took for the electrons to move from here to there). Today, "live" HDTV signals are so digitally buffered, re-timed and re-encoded at every step on their way to us that we're lucky if they're within 20 seconds of photons striking imagers.
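
For anyone who wants to check that 63.5 number, it's just the reciprocal of the horizontal line rate. A quick back-of-the-envelope in Python, using the original B&W scan parameters:

    # Where the 63.5 microsecond figure comes from (original B&W NTSC scan parameters)
    LINES_PER_FRAME = 525
    FRAMES_PER_SEC = 30                              # 60 Hz field rate, interlaced 2:1
    line_rate = LINES_PER_FRAME * FRAMES_PER_SEC     # 15,750 lines per second
    line_period_us = 1e6 / line_rate                 # ~63.5 microseconds per scanline
    print(f"{line_rate} Hz -> {line_period_us:.1f} us per scanline")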

My larger point though was that in the 1930s even that strict signal timing had to be encoded and decoded purely with discrete analog components. I have a 1950s Predicta television, and looking at the components on its boards one can't help wondering "how the hell did they come up with this crazy scheme?" It drives home just how bonkers the whole idea of analog composite television was for the time.

> first, how do you fit in the color so a monochrome TV can still show it?

To clarify for anyone who may not know, analog television was created in the 1930s and '40s as a black-and-white composite standard (later codified by the EIA as RS-170), then in 1953 color was added by a very clever hack which kept all broadcasts backward compatible with existing B&W TVs (eventually codified as RS-170A). Politicians mandated this because they feared nerfing all the B&W TVs owned by voters. But that hack came with some significant technical compromises which complicated and degraded color analog video for over 50 years.
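
For the technically curious, my rough understanding of the 1953 hack: the color information rides on a subcarrier whose frequency is deliberately tied to the line rate so its energy interleaves with the existing luma spectrum, and the line and frame rates were nudged slightly to keep it all locked to the 4.5 MHz sound carrier - which is where the infamous 29.97 fps comes from. A back-of-the-envelope sketch in Python (standard published NTSC numbers, nothing exotic):

    # The backward-compatibility arithmetic behind NTSC color
    SOUND_CARRIER = 4.5e6                  # audio carrier offset kept from B&W NTSC
    line_rate = SOUND_CARRIER / 286        # new horizontal rate: ~15,734.27 Hz
    subcarrier = 455 / 2 * line_rate       # 455 is odd, so an odd multiple of half the
                                           # line rate: chroma lands between luma harmonics
    frame_rate = line_rate / 525           # ~29.97 fps instead of 30
    print(f"line rate  : {line_rate:,.2f} Hz")
    print(f"subcarrier : {subcarrier / 1e6:.6f} MHz")
    print(f"frame rate : {frame_rate:.3f} fps")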

Yes, I knew what you meant, and fully agree. It's fascinating that TV is even possible out of all these rather simple and bulky analog components. Even the first color TVs were built with vacuum tubes and no transistors.

As I recall there are all kinds of hacks in the design to keep them cheap. For instance, letting the fly-back transformer that produces the needed high voltages operate at the same frequency as the horizontal scan rate (~15 kHz), so that one mechanism essentially serves double duty. The same was even seen in microcomputers, where the crystal needed for TV timing was also used to clock the microprocessor - meaning that e.g. a "European" Commodore 64 with PAL was actually a few percent slower than an American C64 with NTSC. And other crazy things like that.
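
To put a rough number on "a few percent": the C64's CPU clock was divided down from the same color crystal used for video, so the PAL and NTSC machines really did run at different speeds. A quick sketch in Python (the constants are the commonly cited figures, quoted from memory):

    # CPU clocks derived from the TV color crystals (commonly cited C64 figures)
    ntsc_crystal = 14.31818e6              # 4x the NTSC color subcarrier
    pal_crystal = 17.734475e6              # 4x the PAL color subcarrier
    ntsc_cpu = ntsc_crystal / 14           # ~1.0227 MHz
    pal_cpu = pal_crystal / 18             # ~0.9852 MHz
    print(f"NTSC CPU: {ntsc_cpu / 1e6:.4f} MHz")
    print(f"PAL  CPU: {pal_cpu / 1e6:.4f} MHz")
    print(f"PAL is {(1 - pal_cpu / ntsc_cpu) * 100:.1f}% slower")   # ~3.7%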

  • > "European" Commodore 64 with PAL was actually a few percent slower than an American C64 with NTSC. And other crazy things like that.

    Indeed! Even in the PlayStation 2 era, many games still ran at different speeds in Europe than in the U.S. and Japan. There were so many legacy artifacts which haunted computers, games, DVDs and more for decades after analog broadcast was supplanted by digital. And it all arose from the fact that the installed base and supporting broadcast infrastructure of analog television was simply too massive to replace. In a way it was one of the biggest accrued "technical debts" ever!

    The only regrettable thing is that during the long, painful transition from analog to digital, a generation of engineers got the idea that the original analog TV standard was somehow bad - which, IMHO, is really unfair. The reality is the original RS-170 standard was a brilliant solution which perfectly fulfilled, and even exceeded, all its intended use cases for decades. The problems only arose when that solution was kept alive far beyond its intended lifetime and then hacked to support new use cases like color encoding while maintaining backward compatibility.

    Analog television was created solely for natural images captured on vacuum tube cameras. Even the concept of synthetic imagery like character generator text and computer graphic charts was still decades in the future. Then people who weren't yet born when TV was created began to shove poorly converted, hard-edged, low-res digital imagery into a standard created to gracefully degrade smooth analog waveforms, and it indeed sucked. I learned to program on an 8-bit computer with 4K of RAM connected to a Sears television through an RF modulator. Even 32 columns of text at 256x192 was a blurry mess with color fringes! On many early 8-bit computers, some colors would invert randomly based on which clock phase the computer started on! Red would be blue and vice versa, so we'd have to repeatedly hit reset until the colors looked correct. But none of that craziness was the fault of the original television engineers; we were abusing what they created in ways they couldn't have imagined.
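
    For anyone curious why the colors flipped: my rough understanding is that those machines drew "color" by toggling pixels at about the subcarrier rate, the TV demodulated that pattern against its burst-locked reference, and if the machine powered up half a subcarrier cycle out of step, every hue rotated 180 degrees into its complement. A simplified sketch in Python (the signal model is idealized and not calibrated to any real machine):

        import numpy as np

        F_SC = 3.579545e6                      # NTSC color subcarrier (Hz)
        fs = 16 * F_SC                         # simulation sample rate
        t = np.arange(0, 200 / F_SC, 1 / fs)   # roughly 200 subcarrier cycles

        def apparent_hue(start_phase_deg):
            # Hard-edged pixel pattern toggling at the subcarrier rate -- the kind of
            # "artifact color" signal an 8-bit computer pushed into the luma channel.
            phase = np.deg2rad(start_phase_deg)
            pattern = 0.5 + 0.5 * np.sign(np.sin(2 * np.pi * F_SC * t + phase))
            # The TV multiplies by its burst-locked reference carriers (quadrature demod).
            i = np.mean(pattern * np.cos(2 * np.pi * F_SC * t))
            q = np.mean(pattern * np.sin(2 * np.pi * F_SC * t))
            return np.degrees(np.arctan2(q, i)) % 360

        print(apparent_hue(0))     # one power-up phase
        print(apparent_hue(180))   # the other phase: hue rotated ~180 degrees (complement)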