Comment by nyanpasu64

13 days ago

- I hear that some later digital PAL TVs stored the incoming image in a framebuffer and scanned it out twice at 100 Hz, which retro gamers today avoid because it adds latency relative to direct scanout (rough latency math sketched after these bullets).

- I've heard mixed reports on whether CRT monitors had faster-decaying phosphors than televisions. Maybe part of it is that a computer typically shows a mostly white image, which makes flicker more noticeable than a dark background with white text (or darker TV scenes) does.
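
Back-of-envelope for the 100 Hz latency point in the first bullet (a sketch assuming the set buffers exactly one whole field before the double-rate scanout and adds no other processing delay; real sets will differ):

```python
PAL_FIELD_RATE_HZ = 50
FIELD_MS = 1000 / PAL_FIELD_RATE_HZ   # 20 ms to receive one 50 Hz field

# Direct scanout: each line is drawn the moment it arrives off the cable,
# so the set itself adds essentially no latency.
#
# Frame-doubling set: capture the whole field into a framebuffer, then scan
# it out twice at 100 Hz.  The top of the picture can't be drawn until the
# field has been captured, so it shows up roughly one field period late.
added_latency_ms = FIELD_MS           # ~20 ms behind direct scanout
second_pass_ms = FIELD_MS / 2         # the repeat scan follows ~10 ms later

print(f"added latency from field buffering: ~{added_latency_ms:.0f} ms")
print(f"second 100 Hz pass begins ~{second_pass_ms:.0f} ms after the first")
```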

I took film photos of my family's color TV in the 90s with a fast shutter speed. About 20% of the screen was still glowing (fading) at any given instant at the 60 Hz NTSC field rate, and the rest was black, so to get a screen that was physically flicker-free with those phosphors you'd have to refresh it somewhere around 500 Hz (rough math sketched below). I doubt color CRT monitors had faster-decaying phosphors than my color TV.

  • I looked up my brightness measurements of a Gateway VX720 VGA monitor (similar to the Diamond Pro 710), and it seems the blue and green phosphors decay to 50% brightness in under 30 microseconds, while red stays bright for around 300 microseconds. I didn't measure how long they took to reach near-black levels. Sadly I never took measurements of my Trinitron TV while I still had it. All in all, the results are inconclusive.
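
For scale, here's a back-of-envelope sketch tying the photo observation and those decay times together. The 20% lit fraction, the plain exponential-decay model, and the 60% dimming tolerance are all my own assumptions rather than measurements:

```python
FIELD_RATE_HZ = 60        # NTSC field rate
FIELD_PERIOD_S = 1 / FIELD_RATE_HZ
LIT_FRACTION = 0.20       # rough share of the screen still glowing in my photos

# If ~20% of the screen glows at any instant, each spot stays visibly bright
# for roughly 20% of a field period after the beam passes it.
persistence_s = LIT_FRACTION * FIELD_PERIOD_S        # ~3.3 ms

# Naive lower bound: come back before the glow runs out entirely.
no_gap_refresh_hz = 1 / persistence_s                # ~300 Hz

# Stricter: come back before the glow has dimmed much, say within the first
# 60% of that window (an arbitrary tolerance I picked), which lands near the
# ~500 Hz figure above.
low_flicker_refresh_hz = 1 / (0.6 * persistence_s)   # ~500 Hz

# Sanity check against the monitor numbers: with a plain exponential decay
# (real phosphor decay is messier than this), a 30 us half-life leaves
# essentially nothing by the time the next 60 Hz field comes around.
remaining = 0.5 ** (FIELD_PERIOD_S / 30e-6)

print(f"effective persistence       = {persistence_s * 1e3:.1f} ms")
print(f"refresh to avoid dark gaps  = {no_gap_refresh_hz:.0f} Hz")
print(f"refresh before much dimming = {low_flicker_refresh_hz:.0f} Hz")
print(f"brightness left after one field at a 30 us half-life: {remaining:.1e}")
```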

My old amber and green CRTs definitely had slower phosphors than any TV; they couldn't show motion at a normal TV frame rate without huge ghosting. They also didn't have noticeable flicker, though, even in black-on-white mode (some programs could do that, and my monitors also had an inverse video button).