Comment by sosomoxie
14 days ago
CRTs are peak steam punk technology. Analog, electric, kinda dangerous. Just totally mindblowing that we had these things in our living rooms shooting electric beams everywhere. I doubt it's environmentally friendly at all, but I'd love to see some new CRTs being made.
There's a synchronous and instantaneous nature you don't find in modern designs.
The image is not stored at any point. The receiver and the transmitter are part of the same electric circuit in a certain sense. It's a virtual circuit, but the entire thing - transmitter and receiving unit alike - is oscillating in unison, driven by a single clock.
The image is never entirely realized as a complete thing, either. While slow phosphor tubes do display a static image, most CRT systems used extremely fast phosphors; they release the majority of the light within a millisecond of the beam hitting them. If you take a really fast exposure of a CRT display (say 1/100,000th of a second) you don't see the whole image on the photograph - only the few most recently drawn lines glow. The image as a whole never exists at the same time. It exists only in the persistence of vision.
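As a rough back-of-the-envelope illustration (assuming NTSC-ish line timing and a ~1 ms phosphor decay - real phosphors vary enormously), you can estimate how little of the frame is glowing at any instant:

    # Rough estimate of how much of a CRT picture is "lit" at any instant.
    # Assumes NTSC line timing and ~1 ms phosphor persistence (both illustrative).
    line_rate_hz = 15_734                 # NTSC horizontal line rate
    lines_per_frame = 525
    line_period_s = 1 / line_rate_hz      # ~63.5 microseconds per line
    phosphor_decay_s = 1e-3               # assumed: most light gone within ~1 ms

    lines_still_glowing = phosphor_decay_s / line_period_s
    print(f"{line_period_s * 1e6:.1f} us per line")
    print(f"~{lines_still_glowing:.0f} of {lines_per_frame} lines glowing at once")
    # -> roughly 16 of 525 lines, which is why a very fast photo of a CRT shows
    #    only a thin glowing band rather than the whole frame.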
> The image is not stored at any point.
Just wanted to add one thing, not as a correction but just because I learned it recently and find it fascinating. PAL televisions (the color TV standard in Europe) actually do store one full horizontal scanline at a time, before any of it is drawn on the screen. This is due to a clever encoding in this format where the TV needs to average two successive scan lines (transmitted with the phase of one colour component flipped between them) to draw them; this cancels out hue errors caused by phase distortion in transmission. It is quite fascinating this was even possible with analogue technology. The line is stored in a delay line for 64 microseconds. See e.g.: https://www.youtube.com/watch?v=bsk4WWtRx6M
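A minimal toy model of the trick, treating chroma as a complex phasor (hue = angle, saturation = magnitude) rather than the real PAL signal chain: because one colour component's sign flips on alternate lines, a constant phase error lands with opposite sign on successive lines, so averaging the current line with the 64 microsecond delayed one cancels the hue error, leaving only a small saturation loss.

    import cmath

    # Toy PAL delay-line model: chroma as a complex phasor.
    true_chroma = cmath.rect(1.0, cmath.pi / 4)   # intended saturation and hue
    phase_error = cmath.rect(1.0, 0.2)            # ~11 degrees of transmission error

    # Line n: V component transmitted normally, error adds to the hue.
    line_n = true_chroma * phase_error
    # Line n+1: V inverted before transmission, re-inverted in the receiver,
    # so the same error ends up subtracted from the hue.
    line_n1 = (true_chroma.conjugate() * phase_error).conjugate()

    recovered = (line_n + line_n1) / 2            # average with the delayed line
    print(cmath.phase(true_chroma), cmath.phase(recovered))   # hue error cancelled
    print(abs(true_chroma), abs(recovered))                   # slight desaturation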
At some point, most NTSC TVs had delay lines, too. A comb filter was commonly used for separating the chroma from the luma, taking advantage of the chroma phase being flipped each line. Sophisticated comb filters would have multiple delay lines and logic to adaptively decide which to use. Some even delayed a whole field or frame, so you could say that in this case one or more frames were stored in the TV.
https://www.extron.com/article/ntscdb3
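A toy sketch of the simplest 1-line comb (assuming the chroma carrier inverts exactly every line and the two lines carry identical picture content - the very assumption those adaptive combs exist to relax): summing the current and delayed lines leaves luma, differencing them leaves chroma.

    import math

    # Toy 1-line comb filter for separating luma and chroma in composite video.
    samples = 16
    luma   = [0.5] * samples                                             # flat picture detail
    chroma = [0.3 * math.cos(math.pi * i / 2) for i in range(samples)]   # toy subcarrier

    line_a = [y + c for y, c in zip(luma, chroma)]   # composite, line n
    line_b = [y - c for y, c in zip(luma, chroma)]   # composite, line n+1 (chroma phase flipped)

    luma_out   = [(a + b) / 2 for a, b in zip(line_a, line_b)]   # chroma cancels
    chroma_out = [(a - b) / 2 for a, b in zip(line_a, line_b)]   # luma cancels
    print(luma_out[:4])     # ~[0.5, 0.5, 0.5, 0.5]
    print(chroma_out[:4])   # ~[0.3, 0.0, -0.3, 0.0]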
1 reply →
I only knew about SECAM, where it's even part of the name (Séquentiel couleur à mémoire - "sequential colour with memory")
2 replies →
The physical components of those delay lines were massive crystals with silver electrodes grafted on to them. Very interesting component.
All PAL TVs had a delay line in them? Crazy.
1 reply →
It doesn't begin at the transmitter either; in the earliest days even the camera was essentially part of the same circuit. Yes, the concept of filming a show and showing the film over the air existed eventually, but before that (and even after that, for live programming) the camera would scan the subject image (actors, etc) line-by-line and send it down a wire to the transmitter, which would send it straight to your TV and into the electron beam.
In fact in order to show a feed of only text/logos/etc in the earlier days, they would literally just point the camera at a physical object (like letters on a paper, etc) and broadcast from the camera directly. There wasn’t really any other way to do it.
Our station had an art department that used a hot press to create text boards, which were set on an easel with a camera pointed at it. By using a black background with white text you could merge the text camera with a camera in the studio and "super-impose" the text onto the video feed.
"And if you tell the kids that today, they won't believe it!"
3 replies →
>>> The image is not stored at any point.
The very first computers (the Manchester Baby) used CRTs as memory - the ones and zeros were bright spots on a "mesh", and the electric charge on the mesh was read and resent back to the CRT to keep the RAM fresh (a sort of self-refreshing RAM)
Yes, but those were not the standard kind of CRTs that are used in TV sets and monitors.
The CRTs with memory for early computers were actually derived from the special CRTs used in video cameras. There the image formed by the projected light was converted into a distribution of charge stored on an electrode, which was then sensed by scanning with an electron beam.
Using CRTs as memory was proposed by von Neumann, and in his proposal he used the appropriate name for that kind of CRT: "iconoscope".
Why didn't that catch on pre-transistor? Feels like you'd get higher density than valves and relays.
1 reply →
Yeah, it's super weird that while we struggle with latency in the digital world, storing anything for any amount of time is an almost impossible challenge in the analog world.
You should check out:
- Core memory
- Drum memory
- Bubble memory
- Mercury delay line memory
- Magnetic tape memory :P
And probably many more. Remember that computers don't even need to be digital!
2 replies →
It's worth deep diving into how analog composite broadcast television works, because you quickly realize just how insanely ambitious it was for 1930s engineers to have not only conceived, but perfected and shipped at consumer scale using only 1930s technologies.
Being old enough to have learned video engineering at the end of the analog days, it's kind of fun helping young engineers today wrap their brains around completely alien concepts, like "the image is never pixels" then "it's never digital" and "never quantized." Those who've been raised in a digital world learn to understand things from a fundamentally digital frame of reference. Even analog signals are often reasoned about as if their quantized form was their "true nature".
Interestingly, I suspect the converse would be equally true trying to explain digital television to a 1930s video engineer. They'd probably struggle similarly, always mentally remapping digital images to their "true" analog nature. The fundamental nature of their world was analog. Nothing was quantized. Even the idea that "quanta" might be at the root of physics was newfangled, suspect and, even if true, of no practical use in engineering systems.
Yes, agreed! And while it is not quantized as such, there is an element of semi-digital protocol to it. The concept of a "scanline" is quantized, and there are "protocols" for indicating when a line ends, when a picture ends, etc. that the receiver and sender need to agree on... and colorburst "packets" for each line, delay lines and all kinds of clever techniques, so it is extremely complicated. Many things were necessary to overcome distortion and also to ensure backwards compatibility - first, how do you fit in the color so a monochrome TV can still show it? Later, how do you make it 16:9 so it can still show on a 4:3 TV (which it could!).
> And while it is not quantized as such there is an element of semi-digital protocol to it.
Yes, before posting I did debate that exact point in my head, with scanlines as the clearest example :-). However, I decided the point is still directionally valid because ultimately most timing-centric analog signal encoding has some aspect of being quantized, if only to thresholds. Technically it would be more correct to narrow my statement about "never quantized" to the analog waveform driving the electron gun as it sweeps horizontally across a line. It always amazes digital-centric engineers weaned on pixels when they realize the timing of the electron gun sweep in every viewer's analog TV was literally created by the crystal driving the sweep of the 'master' camera in the TV studio (and would drift in phase with that crystal as it warmed up!). It's the inevitable consequence of there being no practical way to store or buffer such a high frequency signal for re-timing. Every component in the chain from the cameras to switchers to transmitters to TVs had to lock to the master clock. Live TV in those days was truly "live" to within 63.5 microseconds of photons hitting vacuum tubes in the camera (plus the time it took for the electrons to move from here to there). Today, "live" HDTV signals are so digitally buffered, re-timed and re-encoded at every step on their way to us, we're lucky if they're within 20 seconds of photons striking imagers.
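For the curious, that 63.5 microsecond figure falls straight out of the NTSC numbers (the original monochrome standard was exactly 15,750 lines/s; the figures below are the slightly offset post-1953 color timing):

    # Where the ~63.5 microsecond line period comes from (NTSC color timing).
    subcarrier_hz = 315e6 / 88               # color subcarrier, ~3.579545 MHz
    line_rate_hz  = subcarrier_hz * 2 / 455  # ~15,734.27 lines per second
    field_rate_hz = line_rate_hz / 262.5     # ~59.94 fields per second

    print(f"{1e6 / line_rate_hz:.2f} us per line")    # ~63.56
    print(f"{field_rate_hz:.3f} fields per second")   # ~59.940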
My larger point though was that in the 1930s even that strict signal timing had to be encoded and decoded purely with discrete analog components. I have a 1950s Predicta television, and looking at the components on the boards one can't help wondering "how the hell did they come up with this crazy scheme?" It drives home just how bonkers the whole idea of analog composite television was for the time.
> first, how do you fit in the color so a monochrome TV can still show it?
To clarify for anyone who may not know, analog television was created in the 1930s as a black-and-white composite standard defined by the EIA in the RS-170 specification, then in 1953 color was added by a very clever hack which kept all broadcasts backward compatible with existing B&W TVs (defined in the RS-170A specification). Politicians mandated this because they feared nerfing all the B&W TVs owned by voters. But that hack came with some significant technical compromises which complicated and degraded color analog video for over 50 years.
2 replies →
It's interesting how early digital video systems were influenced by the analog aspects. DVDs were very much still defined by NTSC/PAL even though the data is fully digital.
Indeed and even today's HDTV specification has elements based on echoes reverberating all the way from decisions made in the 1930s when specifying B&W TV.
The composite and component sampling rates (14.32 MHz and 13.5 MHz) both trace back to analog television: 14.32 MHz is 4x the NTSC color subcarrier frequency, and 13.5 MHz was chosen as a common multiple of the NTSC and PAL line rates. And those two frequencies directly dictated all the odd-seeming horizontal pixel resolutions we find in pre-HD digital video (352, 704, 360, 720 and 768) and even the original PC display resolutions (CGA, VGA, XGA, etc).
For example, the 720 horizontal pixels of DVD and digital satellite broadcasts were tied to the digital component video standard sampling the active picture area of an analog video scanline at 13.5 MHz to capture the 1440 clock transitions in that waveform. Similarly, 768 (another common horizontal resolution in pre-HD video) is tied to the composite video standard sampling at 14.32 MHz to capture 1536 clock transitions. The history of how these standards were derived is fascinating (https://tech.ebu.ch/docs/techreview/trev_304-rec601_wood.pdf)
VGA's horizontal resolution of 640 comes from adjusting analog video's rectangular pixels to be square: the 4:3 portion of the 720-sample line is about 704 samples, and 704 * 10/11 ≈ 640. It's kind of fascinating all these modern digital resolutions can be traced back to decisions made in the 1930s based on which affordable analog components were available, which competing commercial interests prevailed (RCA vs Philco) and the political sensitivities present at the time.
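A small sketch of how those resolutions fall out of the line timing (the 910/858 totals, 768/720 active widths and 10/11 pixel aspect ratio are the standard 525-line figures; treat this as illustrative arithmetic rather than a first-principles derivation):

    # How the odd pre-HD horizontal resolutions trace back to analog line timing.
    subcarrier_hz = 315e6 / 88                # NTSC color subcarrier, ~3.5795 MHz
    ntsc_line_hz  = subcarrier_hz * 2 / 455   # ~15,734 Hz
    pal_line_hz   = 15625.0                   # 625-line systems

    composite_fs = 4 * subcarrier_hz          # ~14.32 MHz composite sampling
    component_fs = 13.5e6                     # Rec. 601 component sampling

    print(composite_fs / ntsc_line_hz)   # 910 samples per full line -> 768 active
    print(component_fs / ntsc_line_hz)   # 858 samples per full line -> 720 active
    print(component_fs / pal_line_hz)    # 864 for PAL -> the same 720 active samples
    print(704 * 10 / 11)                 # 640: square-pixel width of the 4:3 picture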
I was on a course at Sony in San Mateo in the 1980s and they had a 36" prototype television in the corner. We all asked for it to be turned on. We were told by the instructor that he was not allowed to turn it on because the 40,000V anode voltage generated too many X-rays at the front of the picture tube.
:-))))
And perhaps peak atompunk too when used as RAM. [0]
[0] https://en.wikipedia.org/wiki/Williams_tube
Damn, what I wouldn't give to be able to look at my computer and see the bits bobbing in its onboard ram
Like the MegaProcessor? [0]
[0] https://www.youtube.com/watch?v=lNa9bQRPMB8
1 reply →
One summer odd-job included an afternoon of throwing a few dozen CRTs off a 3rd floor balcony into a rolloff dumpster. I'da done it for free.
People pay for that these days in smash rooms.
Rock and roll!
Extra dangerous aspect: On really early CRTs they hadn't quite nailed the glass thicknesses. One failure mode was that the neck that held the electron gun would fail. This would propel the gun through the front of the screen, possibly toward the viewer.
I don't know, "Killed by electron gun breakdown" sounds like a rad way to go. You can replace "electron gun" with "particle accelerator" if you want.
Likewise, a dropped CRT tube was a constant terror for TV manufacturing and repair folks, as it likely would implode and send zillions of razor-sharp fragments airborne.
My high school science teacher used to share anecdotes from his days in electrical repair.
He said his coworkers would sometimes toss a television capacitor at each other as a prank.
Those capacitors retained enough charge to give the person unlucky enough to catch one a considerable jolt.
4 replies →
I remember smashing a broken monitor as a kid for fun, hearing about the implosion stuff, and sadly found the back of the glass was stuck to some kind of plastic film that didn't allow the pieces to fly about :(
I still can't get over how we used to put them straight in our faces, yet I never knew of anyone having an accidental face reshaping ever.
That doesn't match my experience of deliberately dropping an old CRT monitor off the roof. Implosions are unfortunately not as exciting as explosions.
2 replies →
What do you mean "had"? I just turned mine off a minute ago. I have yet to make the transition to flat screen TVs, but in the meantime, at least no-one's tracking my consumer habits.
Not through your TV, but they see you driving to the last Blockbuster tho
I wish.
While not entirely thematically unrelated, being electric puts it distinctly outside of steampunk and even dieselpunk. I don't think anyone would call The Matrix steampunk but CRTs are at the center of its aesthetic. Cassette Futurism is the correct term I believe though it also overlaps with some sub-genres of cyberpunk.
With CRTs, the environmental problem is the heavy metals: tons of lead in the glass screen, plus cadmium and whatnot. Supposedly there can be many pounds of lead in a large CRT.
Yes - and x-rays too! Some came from the main TV tube itself (though it was often shielded), but historically the main problem was actually the vacuum rectifiers used to generate the high voltages required. Those vacuum tubes essentially became x-ray bulbs and had to be shielded. This problem arrived with the first color TVs in the late 60s. Color required higher voltages for the same brightness, due to the introduction of a mask that absorbed a lot of the energy. As a famous example, certain GE TVs would emit a strong beam of x-rays, but it was aimed downwards so it would mostly expose someone beneath the TV. Reportedly a few models could emit 50,000 mR/hr at 9 inches distance https://www.nytimes.com/1967/07/22/archives/owners-of-9000-c... which is actually quite a lot (enough for radiation sickness after a few hours). All were recalled of course!
The shadow mask system for colour CRTs was a huge improvement that thwarted worries about "beams everywhere":
https://en.wikipedia.org/wiki/Shadow_mask
Actually, the voltages had to be raised due to the shadow mask, and this rise in voltage meant you were now in x-ray territory, which wasn't the case before. The infamous problems with TVs emitting x-rays, and the associated recalls, involved the early color TVs. And it wasn't so much from the tube, but from the shunt regulators etc. in the power supply that were themselves vacuum tubes. If you removed the protective cans around those you would be exposed to strong radiation. Most of that went away when the TVs were transistorized, so the high-voltage circuits no longer involved vacuum tubes.
Most of those old TVs were not Faraday Caged either, nor were they grounded to earth, so all that radiation and energy was one hardware failure away from seriously unfunny events. Their chassis grounding always gave a tingle to the touch.
Try antialias with that bad boy
The 1940-1990 era of technology can't be beat. Add hard drives and tape to the mix. What happened to electromechanical design? I doubt it would be taught anymore. Everything is solid state
Solid state is the superior technology for almost everything. No moving parts means more reliable, quieter, and very likely more energy efficient since no mass has to move.
Do modern hdd's last as long as the old platter ones? For me, when the SSDs fail it's frustrating because I can't open it up and do anything about it--it's a complete loss. So I tend to have a low opinion of their reliability (same issue I have with old versus new electronic-everything cars). I don't know the actual lifetimes. Surely USB sticks are universally recognized as pretty crappy. I can leave those in the same location plugged in and they'll randomly die after a couple of years.
2 replies →
That and modern digital TV is just incredibly boring from the technical standpoint. Because everything is a computer these days, it's just some MPEG-2 video. The only thing impressive about it is that they managed to squeeze multiple channels worth of video streams into the bandwidth of one analog channel.
Also, I believe precursors to the CRT existed in the 19th century. What was unique with television was the creation of a full CRT system that allowed moving picture consumption to become a mass phenomenon.
We're getting awfully close to recreating CRT qualities with modern display panels. A curved 4:3 1000Hz OLED panel behind glass, and an integrated RetroTink 4K with G-Sync Pulsar support would do it. Then add in a simulated degauss effect and electrical whine and buzzing sounds for fun.
still can't play duck hunt on it though.
Yes you can, see https://neslcdmod.com/
It basically mods the ROM to allow for a bit more latency when checking for hits on the targets
>1000 Hz
This sounds like a brute force solution over just having the display controller read the image as it is being sent and emulating the phosphors.
A 1000 Hz panel does not imply that the computer has to send 1000 frames per second.
Whoops, I misremembered. G-Sync Pulsar works with a 360Hz panel, claims perceived motion clarity comparable to 1000Hz+.
Why curved? We didn't like the CRT curvature back then and manufacturers struggled to make them as flat as possible, finally reaching "virtually flat" screens towards the end of the CRT era. I have one right here on my desk, a Sony Multiscan E200.
This thread makes me realise that the old Telequipment D61 Cathode Ray Oscilloscope I have is worth hanging on to. It's basically a CRT with signal conditioning on its inputs, including a "Z mod" input, making it easy to do cool stuff with it.
This is a cool little project you might be interested in - https://github.com/mausimus/ShaderGlass
'Steampunk' means no electricity. You need to come up with another term. Analogpunk, maybe?
"Dieselpunk" is sometimes considered the next door neighbor term for WW1 through early 1950's retrofuturism with electricity and radios/very early televisions.
Sometimes people use "Steampunk" for shorthand for both because there are some overlaps in either direction, especially if you are trying for "just" pre-WWI retrofuture. Though I think the above poster was maybe especially trying to highlight the sort of pre-WWI overlap with Steampunk with more electricity but not yet as many cars and "diesel".
https://en.wikipedia.org/wiki/Dieselpunk
I don't know. It seems more like a coincidence that steam and electricity were developed at the same time, so worlds with one but not the other seem natural. Another possibility might be no semiconductors. No nuclear also feels plausible, but it's just not interesting. Anything else requires a massive stretch to explain why technology got stuck in such a state.
2 replies →