Comment by jedberg

11 hours ago

This is interesting. John Logie Baird did in fact demonstrate something that looked like TV, but the technology was a dead end.

Philo Farnsworth demonstrated a competing technology a few years later, but every TV today is based on his technology.

So, who actually invented television?

For what it’s worth, Philo Farnsworth and John Logie Baird were friendly with each other. I was lucky to know Philo’s wife Pem very well in the last part of her life, and she spoke highly of Baird as a person.

David Sarnoff and RCA were an entirely different matter, of course…

Whatever we call television now, television then was literally "vision at a distance", which Baird was the first to demonstrate (AFAIK).

The TV I have now in my living room is closer to a computer than a television from when I grew up (born 1975) anyway, so the word could mean all sorts of things. I mean, we still call our pocket computers "phones" even though they are mainly used for viewing cats at a distance.

You should read about the invention of color television. There were two competing methods, one of which depended on a spinning wheel with colored filters in it. If I remember correctly, you needed something like a 10-foot wheel to have a 27-inch TV.

Sure enough, this was the system selected as the winner by the U.S. standard-setting body at the time. Needless to say, it failed and was replaced by what we ended up with... which still sucked because of the horrible decision to go to a non-integer frame rate. Incredibly, for some reason we are still plagued by 29.97 fps long after the analog system that required it was shut off.

  • Why is an integer frame rate better?

    • For one thing, it’s much easier to measure spans of time when you have an integer frame rate. For example, 1 hour at 30fps is exactly 108,000 frames, but at 29.97 it’s only 107,892 frames. Since timecode must still assign a whole-number label to every frame, “drop-frame” timecode is used: frame numbers 00 and 01 are skipped at the start of each minute, except every tenth minute, so that the timecode stays in sync with wall-clock time, i.e. “01:00:00;00” falls after exactly one hour has passed (a code sketch of the scheme follows at the end of this thread). This is of course crucial when scheduling programs, advertisements, and so on. It’s a confusing mess and has historically caused all kinds of headaches for the TV industry.

  • Originally you had 30fps; it was the addition of colour with the NTSC system that dropped it to 30000/1001fps. That wasn't a decision taken lightly -- it was a consequence of retrofitting colour onto a black-and-white system while maintaining backward compatibility.

    When the UK (and Europe) went colour, it changed to a whole new system and didn't have to worry too much about backward compatibility. It had higher bandwidth (8 MHz, so 33% more than NTSC's 6 MHz) and was broadcast on new channels separate from the original 405-line service. It also had features like alternating the phase of the colour signal on every other line to reduce the "tint", or "never twice the same color", problem that NTSC had.

    America chose 30fps but then had to slow it by a factor of 1000/1001 to avoid interference (the arithmetic behind that factor is sketched after this thread).

    Of course, by the '90s and the growth of digital, there was already far too much equipment and content expecting 29.97 Hz, so it remained, again for backward compatibility.

    • An engineer at RCA in New Jersey told me that at the first (early) NTSC color demo, the interference was corrected by hand-tweaking the color sub-carrier oscillator from which the vertical and horizontal intervals were derived, and the final result was what we got.

      The interference was caused when the spectrum of the color sub-carrier overlapped the spectrum of the horizontal interval in the broadcast signal. Tweaking the frequencies allowed the two spectra to interleave in the frequency domain.

    • In the UK the two earliest channels (BBC1 and ITV) continued to broadcast in the 405 line format (in addition to PAL) until 1985. Owners of ancient televisions had 20 years to upgrade. That doesn't seem unreasonable.

    • Understanding the effect of the 1.001 fix has given me tons of job security. That understanding came not just from book learning, but from OJT at a film/video post house with engineers, colorists, and editors who were all willing to entertain a young college kid's constant use of "why?". Then being present for the transition from editing film on flatbeds to editing film transfers on video. Part of that came from having to transfer audio from tape reels to video by swapping in the proper 59.94Hz or 60Hz crystal to control the player's speed. We also had a studio DAT deck that could slow audio recorded in the field at 24fps down to play back at 23.976.

      Literally, to this day, I am dealing with all of these decisions made ~100 years ago. The 1.001 math is a bit younger, dating to when color was rolled out, but what's a little rounding between friends?
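
A minimal sketch of the drop-frame scheme described above, in Python (the function name and structure are mine, purely for illustration; the skip rules are the standard SMPTE drop-frame convention):

```python
def frames_to_dropframe_tc(frame_count: int) -> str:
    """Convert a raw frame count at 30000/1001 fps to SMPTE drop-frame timecode.

    No actual frames are dropped: the labels 00 and 01 are skipped at the
    start of every minute except minutes divisible by 10, which keeps the
    timecode within a few milliseconds of wall clock over each hour.
    """
    fps = 30                                  # nominal (labeling) rate
    frames_per_min = fps * 60 - 2             # 1798 labels in a "drop" minute
    frames_per_10min = fps * 600 - 2 * 9      # 17982: 9 of every 10 minutes drop

    blocks, rem = divmod(frame_count, frames_per_10min)
    if rem < fps * 60:                        # still in the non-drop minute
        drop_minutes = 0
    else:
        drop_minutes = 1 + (rem - fps * 60) // frames_per_min

    # Shift the raw count into the "labeled" numbering, then decompose at 30fps.
    f = frame_count + 2 * (blocks * 9 + drop_minutes)
    return (f"{f // (fps * 3600):02d}:{(f // (fps * 60)) % 60:02d}:"
            f"{(f // fps) % 60:02d};{f % fps:02d}")   # ';' marks drop-frame

# One hour of real time at 30000/1001 fps is the 107,892 frames noted above:
print(frames_to_dropframe_tc(107_892))   # -> 01:00:00;00
print(frames_to_dropframe_tc(1_800))     # -> 00:01:00;02 (labels 00/01 skipped)
```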
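
And the 1.001 factor itself falls out of a few standard NTSC constants; here's the arithmetic as a quick sketch (these are the published NTSC-color figures, not anything specific to this thread):

```python
from fractions import Fraction

# Published NTSC-color constants, used here only to show where 1.001 comes from.
SOUND_CARRIER_HZ = 4_500_000   # audio intercarrier, kept at exactly 4.5 MHz
LINES_PER_FRAME = 525

# The colour subcarrier had to be an odd multiple of half the line rate so the
# chroma and luma spectra interleave (as described above), and its beat with
# the 4.5 MHz sound carrier had to interleave too, so the line rate itself was
# re-derived from the sound carrier:
line_rate = Fraction(SOUND_CARRIER_HZ, 286)   # 15734.2657... Hz (was 15750)
subcarrier = line_rate * Fraction(455, 2)     # 455 is odd: ~3.579545 MHz

frame_rate = line_rate / LINES_PER_FRAME
print(float(line_rate), float(subcarrier), float(frame_rate))
assert frame_rate == Fraction(30_000, 1_001)  # exactly 30 / 1.001 fps
```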

I had a communications theory class in college that addressed "vestigial sideband modulation," which I believe was implemented by Farnsworth. I think this is a critical aspect of the introduction of television technology.

https://en.wikipedia.org/wiki/Single-sideband_modulation#Sup...

  • VSB came later. From https://www.tvtechnology.com/opinions/hdtv-from-1925-to-1994

    In the United States in 1935, the Radio Corporation of America demonstrated a 343-line television system. In 1936, two committees of the Radio Manufacturers Association (RMA), which is now known as the Consumer Electronics Association, proposed that U.S. television channels be standardized at a bandwidth of 6 MHz, and recommended a 441-line, interlaced, 30 frame-per-second television system. The RF modulation system proposed in this recommendation used double-sideband, amplitude-modulated transmission, limiting the video bandwidth it was capable of carrying to 2.5 MHz. In 1938, this RMA proposal was amended to employ vestigial-sideband (VSB) transmission instead of double sideband. In the vestigial-sideband approach, only the upper sidebands (those above the carrier frequency), plus a small segment, or vestige, of the lower sidebands, are transmitted. VSB raised the transmitted video bandwidth capability to 4.2 MHz. Subsequently, in 1941, the first National Television Systems Committee adopted the vestigial sideband system using a total line rate of 525 lines that is used in the United States today.
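
    To make those bandwidth numbers concrete, here's the standard NTSC 6 MHz channel budget as a quick sketch (the figures are the published channel layout; the variable names are mine):

    ```python
    # NTSC 6 MHz channel layout, in MHz from the bottom edge of the channel.
    channel_width = 6.0
    visual_carrier = 1.25                  # vestige: 1.25 MHz of lower sideband kept
    video_top = visual_carrier + 4.2       # full 4.2 MHz upper sideband -> 5.45
    audio_carrier = visual_carrier + 4.5   # sound carrier 4.5 MHz above visual -> 5.75

    print(f"video ends at {video_top} MHz; audio carrier at {audio_carrier} MHz; "
          f"guard to channel edge: {channel_width - audio_carrier:.2f} MHz")

    # Double-sideband would mirror the video spectrum on both sides of the
    # carrier, so in the same 6 MHz channel the video could only be roughly
    # (6 - ~1) / 2 = 2.5 MHz wide, matching the 2.5 MHz figure in the quote.
    ```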

"The Last Lone Inventor: A Tale of Genius, Deceit, and the Birth of Television" is a great book detailing the Farnsworth journey.

The thing is that "television" seemed like a single thing, but it was really a system that required a variety of connected, compatible parts, like the Internet.

Different pieces of what became TV existed in 1900, the challenge was putting them together. And that required a consensus among powerful players.

Baird did. Farnsworth invented the all-electronic version (sans mechanical parts).

Akin to how Ed Roberts, John Blankenbaker, and Mark Dean invented the personal computer, but Apple invented the PC as we know it.

> but every TV today is based on his technology.

Philo Farnsworth's system was built around the cathode ray tube. Unless you're writing this from the year 2009 or before, I'm going to have to push back on the idea that TVs TODAY are based on his technology. They most certainly are not.