CRTs used to be cheap because they were made in high volumes and had a large ecosystem of parts suppliers. If you were to make a CRT today, you'd need to fabricate a lot more parts yourself, and the low volume production would require charging very high prices. You'd also have to deal with more stringent environmental laws, as CRTs contain many toxins, including large amounts of lead.
It's much cheaper to emulate CRT effects so that they work with any display technology. Modern LCDs and OLEDs have fast enough response times that you can get most CRT effects (and omit the ones you dislike, such as refresh flicker). And you don't have to deal with a heavy, bulky display that can implode and send leaded glass everywhere.
Unfortunately, the flicker is essential for the excellent motion quality CRTs are renowned for. If the image on the screen stays constant while your eyes are moving, the image formed on your retina is blurred. Blurbusters has a good explanation:
https://blurbusters.com/faq/oled-motion-blur/
CRT phosphors light up extremely brightly when the electron beam hits them, then exponentially decay. Non-phosphor-based display technologies can attempt to emulate this by strobing a backlight or lighting the pixel for only a fraction of the frame time, but none can match this exponential decay characteristic of a genuine phosphor. I'd argue that the phosphor decay is the most important aspect of the CRT look, more so than any static image quality artifacts.
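Roughly, per-pixel luminance after the beam passes is a bright flash followed by an exponential tail, versus the flat pulse of a strobed backlight. A minimal sketch of the two profiles in Python (the decay constant and pulse width here are purely illustrative, not measured values):

    import math

    def phosphor_luminance(t_ms, peak=1.0, tau_ms=1.0):
        # CRT-style response: bright flash at t=0, then exponential decay.
        return peak * math.exp(-t_ms / tau_ms)

    def strobed_backlight(t_ms, level=1.0, pulse_ms=2.0):
        # Strobed LCD/OLED approximation: flat pulse, then hard off.
        return level if t_ms < pulse_ms else 0.0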
There is such a thing as a laser-powered phosphor display, which uses moving mirrors to scan lasers over the phosphors instead of an electron beam, but AFAIK this is only available as modules intended for building large outdoor displays:
https://en.wikipedia.org/wiki/Laser-powered_phosphor_display
But why would the flicker be considered "excellent motion quality"?
In real life, there's no flicker. Motion blur is part of real life. Filmmakers use the 180-degree shutter rule as a default to intentionally capture the amount of motion blur that feels natural.
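As a quick, purely illustrative calculation of what that rule means in exposure terms:

    def exposure_time(fps, shutter_deg):
        # A rotary shutter is open for shutter_deg/360 of each frame period.
        return (shutter_deg / 360.0) / fps

    exposure_time(24, 180)   # 1/48 s: the "natural" motion-blur default
    exposure_time(24, 90)    # 1/96 s: the crisper, hyper-real look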
I can understand why the CRT would reduce the motion blur, in the same way that when I super-dim an LED lamp at night and wave my hand, I see a strobe effect instead of smooth motion, because the LED is actually flickering on and off.
But I don't understand why this would ever be desirable. I view it as a defect of dimmed LED lights at night, and I view it as an undesirable quality of CRTs. I don't understand why anyone would call that "excellent motion quality" as opposed to "undesirable strobe effect".
Or for another analogy, it's like how in war and action scenes in films they'll occasionally switch to a 90-degree shutter (or something less than 180) to reduce the motion blur to give a kind of hyper-real sensation. It's effective when used judiciously for a few shots, but you'd never want to watch a whole movie like that.
You should be able to emulate close to CRT beam scanout + phosphor decay given high enough refresh rates.
E.g., given a 30 Hz (60i) retro signal, a 480 Hz display has 16 full-screen refreshes for each input frame, while a 960 Hz display has 32. 480 Hz displays already exist, and 960 Hz is expected by the end of the decade.
You essentially draw the frame over and over with progressive darkening of individual scan lines to emulate phosphor decay.
In practice, you'd want to emulate the full beam scanout and not even wait for full input frames in order to reduce input lag.
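A minimal sketch of that idea in Python/NumPy, assuming a 60 Hz input on a 960 Hz panel (16 output refreshes per input frame); the resolution and decay factor are illustrative, not tuned values:

    import numpy as np

    IN_HZ, OUT_HZ = 60, 960
    SUBFRAMES = OUT_HZ // IN_HZ      # 16 output refreshes per input frame
    HEIGHT, WIDTH = 240, 320         # emulated CRT resolution (illustrative)
    DECAY = 0.35                     # per-subframe persistence (assumed)

    def rolling_scanout(frame, screen):
        # Sweep the "beam" down the frame, fading everything already lit
        # to mimic exponential phosphor decay.
        lines = HEIGHT // SUBFRAMES
        for s in range(SUBFRAMES):
            screen *= DECAY                                    # previously lit lines fade
            top = s * lines
            screen[top:top + lines] = frame[top:top + lines]   # beam lights the next band
            yield screen.copy()                                # present this at OUT_HZ

    screen = np.zeros((HEIGHT, WIDTH, 3), dtype=np.float32)
    frame = np.random.rand(HEIGHT, WIDTH, 3).astype(np.float32)
    for out_image in rolling_scanout(frame, screen):
        pass  # send out_image to the display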
Mr. Blurbuster himself has been pitching this idea for a while, as part of the software stack needed once we have 960+ Hz displays to finally get CRT-level motion clarity. For example:
https://github.com/libretro/RetroArch/issues/6984
Is there actually a fundamental physical limit that prevents modern (O)LED displays from emulating that “flicker”, or is it merely that established display driver boards can't do it because it isn't a mainstream requirement? If it's the latter, wouldn't it still be much cheaper to make an FPGA-powered board that drives a modern panel to “simulate” the flicker (in quotes because it may not be simulating anything, merely declining to add artificial persistence) than to bootstrap a modern CRT supply chain?
72 Hz is already a huge improvement in flicker over 60 Hz, though, and certainly maintains excellent motion quality.
And even then, they weren’t that cheap, or at least good ones weren’t. Even with the benefit of mass production, this one cost $40k in today’s money.
No, it's $100,000 in today's money.
Source @1:59: https://m.youtube.com/watch?v=JfZxOuc9Qwk&t=119
Looking at that Dalibor Farný company and how hard it is for them to make new nixie tubes a sustainable business, I shudder to think how much more effort it would be to get new, high-quality CRTs off the ground. It would be cool though. A good start might be bringing back tube rebuilding more widely.
Also, see the visit to one of the last CRT refurbishing facilities out there: https://m.youtube.com/watch?v=YqGaEM9sjVg
I think it's one of these things that people like to talk about in the abstract, but how many people really want a big CRT taking up space in their home?
Modern OLED displays are superior in every way and CRT aesthetics can be replicated in software, so a more practical route would be probably to build some "pass-through" device that adds shadow mask, color bleed, and what-have-you. A lot cheaper than restarting the production of cathode-ray tubes.
I recently bought a big CRT to take up space in my home.
Yes, of course, "objectively" speaking, an OLED display is superior. It has much better blacks and just better colors with a much wider gamut in general. But there's just something about the way a CRT looks - the sharp contrast between bleeding colors and crisp subpixels, the shadows that all fade to gray, the refresh flicker, the small jumps the picture sometimes makes when the decoding circuit misses an HBLANK - that's hard to replicate just in software. I've tried a lot of those filters, and it just doesn't come out the same. And even if it did look as nice, it would never be as cool.
Retro gaming has to be retro. And to be honest, the CRT plays Netflix better as well. It doesn't make you binge, you see? Because it's a little bit awful, and the screen is too small, and you can't make out the subtitles if you sit more than two meters away from the screen, and you can't make out anything if you sit closer than that.
Does that mean we have to restart the production of cathode-ray tubes? Hopefully not. But you can't contain the relics of an era in a pass-through device from jlcpcb.
If the display is working and the input layout isn't changing, you shouldn't accept any jumps at all. If the sync signals are coming at the same rate, the display should remain steady. (Well - as steady as you get with a CRT.) If they don't: it's broken.
> Modern OLED displays are superior in every way and CRT aesthetics can be replicated in software, so a more practical route would be probably to build some "pass-through" device that adds shadow mask, color bleed, and what-have-you.
OLEDs are still behind on motion clarity, but getting close. We finally have 480 Hz OLEDs, and seem to be on track for the 1000 Hz needed to match CRTs.
The Retrotink 4K also exists as a standalone box to emulate CRTs and is really great. The main problem is that its output is HDMI 2.0, so you need to choose between 4K60 output, with better resolution for emulating CRT masks/scanlines, or 1440p120 for better motion clarity.
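A rough sanity check of why HDMI 2.0 forces that choice (numbers are approximate; real link budgets depend on exact blanking timings):

    HDMI20_GBPS = 14.4   # usable TMDS data rate for HDMI 2.0, roughly

    def approx_gbps(w, h, hz, bits_per_pixel=24, blanking_overhead=1.2):
        return w * h * hz * bits_per_pixel * blanking_overhead / 1e9

    approx_gbps(3840, 2160, 60)    # ~14.3 Gbit/s: fits, barely
    approx_gbps(2560, 1440, 120)   # ~12.7 Gbit/s: fits
    approx_gbps(3840, 2160, 120)   # ~28.7 Gbit/s: needs HDMI 2.1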
Something like 4K500 or 4K1000 is likely needed to really replace CRTs completely.
Really hoping that by the time 1000 Hz displays are common we end up with some pass-through box that can fully emulate everything. Emulating a full rolling CRT gun scanout should be possible at that refresh rate, which would be amazing.
1000Hz is enough to match CRT quality on a sample-and-hold display, but only when you're displaying 1000fps content. A great many games are limited to 60fps, which means you'll need to either interpolate motion, which adds latency and artifacts, or insert black frames (or better, black lines for a rolling scan, which avoids the latency penalty), which reduces brightness. Adding 16 black frames between every image frame is probably going to reduce brightness to unacceptable levels.
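The brightness hit is easy to ballpark from those figures:

    display_hz, content_fps = 1000, 60
    refreshes_per_frame = display_hz / content_fps   # ~16.7 refreshes per content frame
    duty_cycle = 1 / refreshes_per_frame             # ~0.06
    # Lighting each 60 fps frame for one refresh and black otherwise keeps only
    # ~6% of sample-and-hold brightness; a rolling scan has the same duty cycle
    # but spreads it across the frame, which at least avoids the latency penalty.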
Why stop there? We can simulate the phosphor activation by the electron beam quite accurately with 5 million FPS or so.
And the difference between 480 and 1000 Hz is perceptible?
Such products exist: https://www.retrotink.com/shop
You should probably watch one of the old films about how CRTs were made. It's not a simple process and basically would require setting up a whole factory to mass produce them.
Hobbyist-level production of monochrome TV tubes is possible, but a big effort. Some of the early television restorers have tried.[1] Color, though, is far more complicated. A monochrome CRT just has a phosphor coating inside the glass. A color tube has photo-etched patterns of dots aligned with a metal shadow mask.
CRT rebuilding, where the neck is cut off, a new electron gun installed, and the tube re-sealed and evacuated, used to be part of the TV repair industry. That can be done in a small-scale workshop.
There's a commercial business which still restores CRTs.[2] Most of their work is restoring CRTs for old military avionics systems. But there are a few Sony and Panasonic models for which they have parts and can do restoration.
[1] http://earlytelevision.org/crt_project.html
[2] https://www.thomaselectronics.com
A practical issue is likely shipping costs. There aren't many consumer products that would be more costly to move around, so at the high end you're looking at selling something as unwieldy as a fridge.
I imagine one could target smaller CRTs, though.
The whole supply chain is dead. No way the demand is great enough to justify rebooting it.
I know there have been conversations here about simulating CRT subpixels on HiDPI displays. There are some games that used subpixel rendering to achieve better antialiasing. With HiDPI you at least have a chance of doing it well.
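As a sketch of what that could look like, assuming an aperture-grille-style stripe pattern and an integer HiDPI scale factor (both illustrative choices, not from any particular game or shader):

    import numpy as np

    SCALE = 6  # physical pixels per emulated pixel (assumed HiDPI factor)

    def stripe_mask(scale):
        # Vertical R/G/B stripes within each emulated pixel, plus a dimmed
        # bottom row as a crude scanline gap.
        mask = np.zeros((scale, scale, 3), dtype=np.float32)
        third = scale // 3
        mask[:, 0:third, 0] = 1.0           # red stripe
        mask[:, third:2 * third, 1] = 1.0   # green stripe
        mask[:, 2 * third:, 2] = 1.0        # blue stripe
        mask[-1, :, :] *= 0.3
        return mask

    def upscale_with_mask(src):
        # src: (H, W, 3) float image -> (H*SCALE, W*SCALE, 3) masked image.
        big = np.kron(src, np.ones((SCALE, SCALE, 1), dtype=np.float32))
        return big * np.tile(stripe_mask(SCALE), (src.shape[0], src.shape[1], 1))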