Comment by emh68

5 months ago

Sometimes I think about the bizarre path computer technology took.

For instance, long-term storage. It would stand to reason that we'd invent some kind of big electrical array, and that's the best we could hope for. But hard drive technology (which relies on crazy materials science for the platter and magnets, crazy high-precision encoders, and crazy physics like flying a tiny spring-loaded head on the cushion of air created by the spinning platter) came along and blew all other technology away.

And, likewise, we had liquid crystal technology since the 70s, and probably could have invented it sooner, but there was no need, because cathode ray tube technology appeared (a mini particle accelerator in your home! Plus the advanced materials science to bore the precision electron-beam holes in the shadow mask, the phosphor coating, the unusual deflection coil winding topology, and leaded glass to reduce x-ray exposure for viewers) and made all other forms of display unattractive by comparison.

It's amazing how far CRT technology got, given its disconnect from other technologies. The sophistication of the factories that created late-model "flat-screen" CRTs is truly impressive.

The switch to LCDs/LEDs was in a lot of ways a step back. Sure, we don't have huge 40lb boxes on our desks, but we lost the ultra-fast refresh rate enabled by the electron beam, not to mention the internal glow that made computers magical (maybe I'm just an old fuddy-duddy, like people in the 80s who swore that vinyl records "sounded better").

Someday, maybe given advances in robotics and automation, I hope to start a retro CRT manufacturing company. The problems are daunting: the entire supply chain is gone (you can't even buy an electron gun; it would have to be made from scratch), and there are environmental restrictions (leaded glass probably makes the EPA perk up and take notice).

> like people in the 80s who swore that vinyl records "sounded better"

I'm not one of those people who ever thought vinyl sounded better than a properly recorded and mastered digital version, and I've always believed a high-bandwidth digital audio signal chain can recreate the "warmth" and other artifacts of tube compressors well beyond the threshold of human perception. However, a broadcast-quality, high-definition CRT fed a pristine hi-def analog RGB signal can still create some visuals which current flat screens can't. This is only controversial because most people have never seen that kind of CRT; they were incredibly rare.

I got to see one of the broadcast production CRTs made to support NHK's analog high-definition video format in the 90s, directly connected to HD broadcast studio cameras, and the image quality was simply sensational. It was so much better than even the best consumer CRT TVs that it was simply another thing entirely. Of course, it cost $40,000 and only a few dozen were ever made, but it was only that expensive because these were prototypes built years before digital hi-def would be standardized and begin mass production.

In fact, I think if it were A/B-compared next to a current high-end consumer flat screen, a lot of people would say the CRT looks more pleasing and overall better. For natural imagery, a CRT could render the full fidelity and sharpness of a 1080 image but without the over-crisp 'edginess' today's high-end flat screens have. And those "cathode rays" can render uniquely rich and deep colors versus diodes and crystals. Of course, for synthetic images like computer interfaces and high-DPI text, a flat screen can be better, but for natural imagery we lost something which hasn't yet been replaced. I'd love to see an ultra high-end CRT like that designed to display modern uncompressed 4K 12-bit HDR digital video.

  • I had a music teacher who insisted analog recordings were different.

    One day she said there is a simple way to prove it: the strings of certain instruments will vibrate sympathetically on their own when placed near a source playing the same pitch. Put such an instrument in front of a speaker playing from an analog source and the strings move; play the exact same music from a digital source through the same speaker and the strings stop moving, even though to most humans it sounds exactly the same.

    Sadly, I never had the gear to test this; I am not a professional musician and was learning from her as a hobby (she teaches professional musicians).

    • If you do ever test this, and do it rigorously (i.e. using analogue and digital versions of the same recording, with no pitch inaccuracies) you'll find the strings will resonate equally well with analogue and digital recordings, all other things (volume, tuning of the instrument, etc.) being equal.


    • I find this dubious, since the effect she was describing is sympathetic resonance: the string responds to acoustic energy at its natural frequency. Since, in the example provided, the source is an amplified speaker pushing air in both cases, the outcome should be the same. The more famous demonstration of this principle is breaking a glass, and I would be surprised if that hadn't been done with digital signal inputs.

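The skeptics' point can be sketched with a toy model: treat the sympathetic string as a driven damped harmonic oscillator. Its steady-state response depends only on the frequency content and level of the air pressure wave hitting it, not on whether that wave originated from tape or from a file. (The damping ratio below is a made-up illustrative value, not a measured property of any real string.)

```python
import math

def driven_oscillator_amplitude(drive_freq_hz, natural_freq_hz, damping_ratio=0.01):
    # Steady-state amplitude gain of a damped harmonic oscillator
    # (a crude model of a sympathetic string) under sinusoidal drive.
    r = drive_freq_hz / natural_freq_hz  # ratio of drive to natural frequency
    return 1.0 / math.sqrt((1 - r**2) ** 2 + (2 * damping_ratio * r) ** 2)

# A string tuned to A440 responds strongly at 440 Hz and much more weakly
# a semitone away -- regardless of the source of the 440 Hz tone:
on_pitch = driven_oscillator_amplitude(440.0, 440.0)    # ≈ 50x gain at resonance
off_pitch = driven_oscillator_amplitude(466.16, 440.0)  # far smaller gain off-pitch
print(on_pitch, off_pitch)
```

If identical waveforms reach the speaker, the oscillator model gives identical string motion; any real difference would have to come from the reproduction chain (pitch error, level, distortion), not from "analog vs. digital" as such.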

  • Have you looked at any high end OLEDs lately?

    • Yes, my background is in broadcast video engineering. An edit suite I'm in regularly has a $10,000 24-inch BVM-E251 reference monitor for color grading. At home I have a $4,000 LG C5 OLED. My dedicated home theater room is based around a $12,000 4K laser projector. I also own a Sony BVM series broadcast CRT, various Trinitron CRTs, and a retro gaming arcade cabinet built around a 25-inch analog RGB industrial CRT (Wells Gardner D9200). I use an optical colorimeter for display calibration.

      All of these displays are unique tools, each with differing capabilities, and I own and use them all for what they are best at. Flat panel technologies can produce incredible visuals with certain strengths no CRT can replicate (when properly calibrated and given a high-quality source signal). However, the reverse is also true: extremely high-end cathode ray technology, with an appropriately high-definition dot/shadow mask and phosphors, can generate visuals with traits no current flat panel display technology can duplicate.

      To be clear, I'm not talking about any CRT consumer television you've ever seen. A decent OLED display of today can look far better than even the best 1990s televisions. But consumer televisions were standard definition and hard-limited to less than 6 MHz of signal bandwidth (usually much less), so any comparison between the fundamental display technologies ('cathode ray-irradiated phosphors' vs. 'light emitting diodes') is meaningless unless evaluated with the same resolution and bandwidth of input signal. And you've never seen a high-definition CRT like a KW-3600HD fed with a 30 MHz HD source signal. But they exist, and I've seen one.

      Everything in display engineering involves trade-offs. CRTs and light emitting diodes are based on different materials with fundamentally different optical properties and underlying physics. Each has its own unique strengths. Neither can fully replicate the entire range of the other in every respect. This is not a personal aesthetic opinion; it's a carefully qualified technical assessment based on objective measurement, and it's consistent with the physical capabilities of the respective technologies.
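The bandwidth figures above can be sanity-checked with back-of-the-envelope arithmetic: an analog luminance signal needs roughly one cycle per two horizontal pixels (Nyquist), scaled by line count and frame rate. The blanking-overhead factor below is an assumed round number for illustration, not a value from any standard.

```python
def luma_bandwidth_mhz(h_pixels, lines, frame_rate_hz, blanking_overhead=1.25):
    # Rough analog luminance bandwidth needed to resolve h_pixels per line.
    # One full signal cycle spans two pixels, so bandwidth ≈ pixel rate / 2.
    pixels_per_second = h_pixels * lines * frame_rate_hz * blanking_overhead
    return pixels_per_second / 2 / 1e6

# Standard definition (~640x480 at 30 frames/s):
sd = luma_bandwidth_mhz(640, 480, 30)    # ≈ 5.8 MHz, same order as a <6 MHz channel
# Analog high definition (~1920x1035 at 30 frames/s, Hi-Vision class):
hd = luma_bandwidth_mhz(1920, 1035, 30)  # ≈ 37 MHz, same order as the 30 MHz cited
print(sd, hd)
```

This is why comparing a 6 MHz consumer set to a modern panel says nothing about the underlying display technology: the input signal itself caps the resolvable detail by an order of magnitude.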

I think what's even more interesting is how CRTs evolved. Conceptually, it was like going from an incandescent bulb to a single LED to a seven-segment display and then to an LCD. The progression from the neon bulb up to HD CRT tubes is pretty much linear! We started with "magic eye" tubes, a sort of radial bar graph, then tubes that could raster a single line, which we used for oscilloscopes. Then monochrome 2D raster, and then more and more complex color raster systems.

It's pretty neat how smoothly the technology progressed through every intermediate step. There weren't many huge revolutionary leaps, just steady progress.

Imo OLED has completely eclipsed CRT by now.

I don't know enough to say where CRTs could be today if they had gotten the development $ that went into other tech. But to be as good as OLEDs, they would have had to find something other than phosphor for the inner coating.

For response times, CRT will always remain the king of dark-to-light transitions, but afterglow on bright-to-dark transitions would always be a factor unless a different phosphor coating was developed. OLEDs have no such issue. Subjectively, the claimed <0.1 ms response times are real and there are zero artifacts: no afterglow, no ghosting, just extremely sharp, well-defined motion.
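The afterglow point follows from simple decay arithmetic. As a toy sketch (assuming an idealized exponential decay and a made-up ~1 ms persistence constant; real phosphors often decay non-exponentially and vary widely by type):

```python
import math

def decay_time_ms(tau_ms, threshold=0.10):
    # Time for an exponentially decaying phosphor, I(t) = I0 * exp(-t/tau),
    # to fall to `threshold` of its initial brightness.
    return tau_ms * math.log(1.0 / threshold)

# A fast phosphor with an assumed 1 ms time constant still takes
# noticeably longer than an OLED's ~0.1 ms switch to go dark:
t10 = decay_time_ms(1.0)
print(t10)  # ≈ 2.3 ms to reach 10% brightness
```

Even with an optimistic time constant, the bright-to-dark tail is an order of magnitude slower than an OLED pixel simply switching off, which is why the asymmetry persists regardless of how fast the electron beam itself is.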

> It's amazing how far CRT technology got

And China is still building, today, brand new CRT boards for CRT TVs and monitors. You can buy them on AliExpress.

I don't know if CRTs themselves are still being built, though.

I'm hanging on to my vintage arcade cab from the 80s with its still-working huge CRT screen. I hope I fail before that thing does (and I hope it doesn't fail anytime soon!).

> The switch to LCDs/LEDs was in a lot of ways a step back. Sure, we don't have huge 40lb boxes on our desks, but we lost the ultra-fast refresh rate enabled by the electron beam, not to mention the internal glow that made computers magical (maybe I'm just an old fuddy-duddy, like people in the 80s who swore that vinyl records "sounded better").

CRTs don't have particularly good refresh rates. There is very little delay in the output scan, but 99% of the time the delays built into rendering make that irrelevant compared to fast screens using other technologies. And the interval between scans doesn't go very low.

I have no idea what you mean by internal glow.