Comment by retrac

14 days ago

There's a synchronous and instantaneous nature you don't find in modern designs.

The image is not stored at any point. The receiver and the transmitter are part of the same electric circuit, in a certain sense. It's a virtual circuit, but the entire thing - transmitter and receiving unit alike - oscillates in unison, driven by a single clock.

The image is never entirely realized as a complete thing, either. While slow-phosphor tubes do display a static image, most CRT systems used extremely fast phosphors, which release the majority of their light within a millisecond of the beam hitting them. If you take a really fast exposure of a CRT display (say 1/100,000th of a second) you don't see the whole image in the photograph - only the few most recently drawn lines glow. The image as a whole never exists at the same time. It exists only in the persistence of vision.
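To get a feel for the numbers, here's a toy sketch (not a model of any particular tube; the 480 lines, 60 Hz field rate and ~1 ms decay constant are all assumptions) computing how brightly each scanline would still be glowing at the instant that fast photo is taken:

    import math

    LINES = 480          # assumed visible scanlines per field
    FIELD_TIME = 1 / 60  # assumed time to paint one field, in seconds
    TAU = 1e-3           # assumed phosphor decay constant, ~1 ms

    line_period = FIELD_TIME / LINES  # ~35 microseconds per line

    # A ~1/100,000 s shutter opens just as the last line finishes; the
    # exposure is so short that each line's relative brightness is simply
    # how far its phosphor has decayed since the beam drew it.
    for n in (479, 470, 400, 240, 0):
        age = (LINES - 1 - n) * line_period
        print(f"line {n:3d}: drawn {age * 1e3:6.2f} ms ago -> brightness {math.exp(-age / TAU):.6f}")

Under those assumptions the bottom few dozen lines dominate and the top of the frame has faded to parts per million, which is exactly the bright band you see in such photographs.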

> The image is not stored at any point.

Just wanted to add one thing, not as a correction but just because I learned it recently and find it fascinating. PAL televisions (the color TV standard in much of Europe) actually do store one full horizontal scanline at a time, before any of it is drawn on the screen. This is due to a clever encoding in the format: the TV averages the color signal of each scanline with that of the previous one (the two are phase-shifted relative to each other), which cancels out phase distortion picked up in transmission. It is quite fascinating that this was even possible with analogue technology. The line is stored in a delay line for 64 microseconds (one scanline period). See e.g.: https://www.youtube.com/watch?v=bsk4WWtRx6M
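A rough sketch of why the averaging cancels phase errors, treating the chroma as a complex phasor (angle = hue, magnitude = saturation); the 20° error and the test hue are made-up values, and PAL's per-line flip of the V component is modeled here as complex conjugation:

    import cmath, math

    hue = math.radians(103)  # arbitrary test hue
    phi = math.radians(20)   # assumed differential phase error in transmission

    c = cmath.exp(1j * hue)                       # ideal chroma phasor, saturation 1
    line_a = c * cmath.exp(1j * phi)              # normal line, rotated by the error
    line_b = c.conjugate() * cmath.exp(1j * phi)  # next line: V flipped, same error

    # The receiver un-flips the alternate line and averages it with the
    # line coming out of the 64-microsecond delay line.
    avg = (line_a + line_b.conjugate()) / 2

    print(math.degrees(cmath.phase(line_a) - hue))  # ~20 degrees of hue error
    print(math.degrees(cmath.phase(avg) - hue))     # ~0: hue error cancelled
    print(abs(avg))                                 # cos(20 deg) ~ 0.94: slight desaturation

The phase error ends up as a small saturation loss instead of a hue shift, which the eye tolerates far better.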

  • At some point, most NTSC TVs had delay lines, too. A comb filter was commonly used for separating the chroma from the luma, taking advantage of the chroma phase being flipped each line. Sophisticated comb filters would have multiple delay lines and logic to adaptively decide which to use. Some even delayed a whole field or frame, so you could say that in this case one or more frames were stored in the TV.

    https://www.extron.com/article/ntscdb3
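    A minimal sketch of the 1-line comb idea (toy sample values, not real NTSC timing): since the subcarrier phase flips between successive lines, summing a line with the delayed previous one cancels chroma and keeps luma, while differencing does the reverse:

        # Toy composite signal: two scanlines with identical picture content,
        # chroma subcarrier phase inverted on the second line.
        luma   = [0.5, 0.7, 0.3, 0.9]
        chroma = [0.2, -0.1, 0.15, 0.05]

        line_n    = [y + c for y, c in zip(luma, chroma)]
        line_prev = [y - c for y, c in zip(luma, chroma)]  # phase-flipped chroma

        # One-line delay plus add/subtract is the comb filter.
        luma_est   = [(a + b) / 2 for a, b in zip(line_n, line_prev)]
        chroma_est = [(a - b) / 2 for a, b in zip(line_n, line_prev)]

        print(luma_est)    # recovers luma exactly when adjacent lines match
        print(chroma_est)  # recovers chroma

    Real pictures differ from line to line, of course, which is why the sophisticated adaptive combs switch between line and frame delays.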

    • If a motion-adaptive 3D comb filter (which requires comparing successive frames) was present in a TV, you can bet it would be plastered all over the marketing material.

  • I only knew about SECAM, where it’s even part of the name (Séquentiel Couleur À Mémoire)

    • You can decode a PAL signal without any memory; the memory is only needed to correct for phase errors. In SECAM, though, it's a hard requirement, because the two color components, Db and Dr, are transmitted on alternating lines and you need both on every line.
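      A toy sketch of that hard requirement (made-up component values): each incoming line carries only one of the two components, so a one-line memory must supply the other:

          # Db arrives on even lines, Dr on odd lines; the delay line
          # supplies whichever component the current line lacks.
          transmitted = [("Db", 0.30), ("Dr", -0.10), ("Db", 0.35), ("Dr", -0.05)]

          delayed = None
          for which, value in transmitted:
              if delayed is not None:
                  db = value if which == "Db" else delayed[1]
                  dr = value if which == "Dr" else delayed[1]
                  print(f"draw line with Db={db:+.2f}, Dr={dr:+.2f}")
              delayed = (which, value)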

  • The physical components of those delay lines were massive crystals with silver electrodes grafted onto them. Very interesting components.

It doesn’t begin at the transmitter, either; in the earliest days even the camera was essentially part of the same circuit. Yes, the concept of filming a show and broadcasting the film over the air existed eventually, but before that (and even after that, for live programming) the camera would scan the subject image (actors, etc.) line by line and send it down a wire to the transmitter, which would send it straight to your TV and into the electron beam.

In fact, in order to show a feed of only text/logos/etc. in the earlier days, they would literally just point the camera at a physical object (like letters on paper) and broadcast from the camera directly. There wasn’t really any other way to do it.

  • Our station had an art department that used a hot press to create text boards, which were set on an easel with a camera pointed at it. By using a black background with white text you could merge the text camera with a camera in the studio and superimpose the text onto the video feed.

    "And if you tell the kids that today, they won't believe it!"

>>> The image is not stored at any point.

The very first computers (the Manchester Baby) used CRTs as memory - Williams–Kilburn tubes. The ones and zeros were bright spots on a "mesh", and the electric charge on the mesh was read and resent back to the CRT to keep the RAM fresh (a sort of self-refreshing RAM).
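A toy model of that self-refreshing loop (the decay rate and threshold are made-up numbers, not Williams-tube physics): each spot's charge leaks away, so the beam keeps re-reading every bit while it is still readable and rewriting it at full strength:

    DECAY = 0.8      # assumed fraction of charge surviving one scan pass
    THRESHOLD = 0.3  # assumed level below which a spot can't be read

    bits = [1, 0, 1, 1, 0, 0, 1, 0]    # data to store
    charge = [float(b) for b in bits]  # spot charge on the tube face

    def scan_pass(charge, refresh):
        for i in range(len(charge)):
            charge[i] *= DECAY                        # charge leaks away
            read = 1 if charge[i] > THRESHOLD else 0  # sense the spot
            if refresh and read:
                charge[i] = 1.0                       # beam rewrites the spot

    for _ in range(100):
        scan_pass(charge, refresh=True)
    print([1 if c > THRESHOLD else 0 for c in charge])  # data intact

    charge = [float(b) for b in bits]
    for _ in range(100):
        scan_pass(charge, refresh=False)
    print([1 if c > THRESHOLD else 0 for c in charge])  # all zeros: data lost

Same principle as DRAM refresh today, just with an electron beam doing the rounds instead of sense amplifiers.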

  • Yes, but those were not the standard kind of CRTs that are used in TV sets and monitors.

    The CRTs with memory for early computers were actually derived from the special CRTs used in video cameras. There, the image formed by the projected light was converted into a distribution of charge stored on an electrode, which was then sensed by scanning it with an electron beam.

    Using CRTs as memory was proposed by von Neumann, and in his proposal he used the appropriate name for that kind of CRT: "iconoscope".

  • Why didn't that catch on pre-transistor? Feels like you'd get higher density than valves and relays.

    • DRAM-style memories made with special storage CRTs were used for a few years, until about 1954. For instance, the first generation of commercial electronic computers made by IBM (the scientific IBM 701 and the business-oriented IBM 702) used such CRTs.

      Then CRT memories became obsolete almost overnight, due to the development of magnetic-core memories, which did not require periodic refreshing and were significantly faster. The fact that cores were also non-volatile was convenient at that early time, though not essential.

      Today, due to security concerns, you would actually not want your main memory to be non-volatile, unless you always encrypt it completely, which creates its own problems of secret-key management.

      So CRT memories became obsolete several years before vacuum tubes in computers were replaced by transistors, which happened around 1959/1960.

      Besides CRT memories and delay-line memories, another kind of early computer memory that quickly became obsolete was magnetic-drum memory.

      In the cheapest early computers (like the IBM 650), the main memory was not a RAM at all (neither CRT nor magnetic-core), but a magnetic drum, i.e. with sequential, periodic access to data.
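      To put "sequential periodic access" into numbers (back-of-the-envelope; 12,500 rpm is the commonly quoted IBM 650 drum speed): on average you wait half a revolution for the word you want to pass under the heads:

          RPM = 12_500             # commonly quoted IBM 650 drum speed
          rev_time = 60 / RPM      # one revolution: 4.8 ms
          avg_wait = rev_time / 2  # on average, half a turn per random access

          print(f"revolution {rev_time * 1e3:.1f} ms, average access {avg_wait * 1e3:.1f} ms")

      That ~2.4 ms average is thousands of times slower than a core or CRT random access, which is why IBM 650 programmers laid code out around the drum so the next instruction arrived just as the previous one finished.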

Yeah, it's super weird that while we struggle with latency in the digital world, storing anything for any amount of time is an almost impossible challenge in the analog world.