
Comment by kragen

12 days ago

I took film photos of my family's color TV in the 90s with a fast shutter speed. At the (NTSC) 60Hz field rate, only about 20% of the screen was glowing and fading at any given instant; the rest was black. That works out to roughly 3ms of visible persistence per spot, so to get a screen that was physically flicker-free with those phosphors, you'd have to refresh it at somewhere around 500Hz. I doubt color CRT monitors had faster-decaying phosphors than my color TV.
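For what it's worth, here's the back-of-envelope version of that estimate as a quick Python sketch. The 20% figure is just my eyeball reading of the photos, so treat the outputs as order-of-magnitude:

    # Rough persistence estimate from the photos: ~20% of the screen
    # visibly glowing at once, at the 60Hz NTSC field rate.
    field_rate_hz = 60.0
    lit_fraction = 0.20

    persistence_s = lit_fraction / field_rate_hz  # ~3.3ms of visible glow
    min_refresh_hz = 1.0 / persistence_s          # ~300Hz just to keep
                                                  # every spot lit at all
    print(f"persistence ~ {persistence_s * 1e3:.1f} ms")
    print(f"minimum continuous-glow refresh ~ {min_refresh_hz:.0f} Hz")

That gives ~300Hz as the bare minimum for every spot to stay lit; holding the brightness roughly constant, rather than sawtoothing between full and dim, pushes it higher, which is how I get to the ~500Hz ballpark.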

I looked up my brightness measurements of a Gateway VX720 VGA monitor (similar to a Diamond Pro 710), and it seems the blue and green phosphors decay to 50% brightness in under 30 microseconds, while red stays bright for 300 microseconds. I didn't measure how long they took to reach near-black levels, and sadly I never measured my Trinitron TV while I still had it. All in all, the results are inconclusive.
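If you assume a simple exponential decay, you can extrapolate those 50% figures out to a "near-black" level; real phosphor decay often has a power-law tail, so this is more of a lower bound, and the 1% threshold is an arbitrary choice of mine:

    import math

    # Extrapolate the measured 50%-decay times to 1% brightness,
    # assuming pure exponential decay (real phosphors often have
    # long power-law tails, so these are lower bounds).
    half_life_us = {"blue/green": 30.0, "red": 300.0}
    target = 0.01  # "near-black" threshold, arbitrarily 1%

    n_halvings = math.log2(1.0 / target)  # ~6.6 halvings to reach 1%
    for name, t_half in half_life_us.items():
        print(f"{name}: ~{t_half * n_halvings:.0f} us to {target:.0%}")

Under that assumption even red would be near-black within about 2 milliseconds, well inside a 16.7ms field, which is hard to square with how the TV photos looked; hence "inconclusive".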