Comment by perching_aix
12 days ago
It's always surprising to me to see people regard 60 Hz CRTs as "flicker-free", "minimal flicker", etc. Whenever I saw a CRT running at 60 Hz, I'd immediately be able to tell. I always used at minimum 75 Hz, but preferably 85 Hz, at home (early 2000s, Windows).
Have you ever seen something running at 30 Hz? Or even 15? The difference in flicker between 30 and 60 Hz is much, much larger than the difference between 60 and 120! Yeah, 60 isn't flicker-free; no finite rate is (there are probably quantum limits), but realistically you reach a point where you can't really tell. For most purposes 60 Hz is close enough, though you can still tell.
Frankly, I don't remember. For what it's worth, TV sets would always be 50 Hz here (PAL) (unless they did some tomfoolery I'm not aware of and ran at 100 Hz "in secret" or something), and evidently I could watch those for hours on end, for years and years, without too many holdups, so clearly it wasn't a dealbreaker. But on monitors, yeah, I just wouldn't tolerate it, whereas 85 Hz felt perfect (no discernible flicker that I'd recall).
- I hear that some later digital PAL TVs stored an image in a framebuffer and scanned it out twice at 100 Hz, which retro gamers today avoid because it increases latency relative to direct scanout.
- I've heard mixed reports on whether CRT monitors had faster-decaying phosphors than televisions. Maybe part of it is that a computer typically shows a mostly white image, which causes more noticeable flicker than white text on a dark background (or darker TV scenes).
4 replies →
That's interesting. 60 Hz TVs always gave me headaches, but my 75 Hz computer monitor didn't.
I think it was actually the interlacing and not the refresh rate that did it.
> TV sets would always be 50 Hz here (PAL)
but only half the lines in each field, so in effect every individual line was refreshing at only 25 Hz
I have recently been playing with CRTs again, and something I have noticed is that for fast-paced games running at 60 or 70 Hz* I don't notice the flicker much, but for text anything less than 85 Hz is headache-inducing. Luckily the monitor I got can do 1024x768 at 100 Hz :)
* The original VGA and thus most MS-DOS games ran at 70 Hz.
I remember when I got my first computer of my own, instead of sharing with others, it was an "obvious requirement" that the screen run at a minimum of 72 Hz, preferably higher. Which was why the 15" CRT had to run at 800x600.
Later on, with a graphics card that had more than 2 MB of RAM, I remember experimenting a lot with modelines to pull higher refresh rates and higher resolutions out of the 17" CRT I inherited when my father switched to a laptop :)
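In case anyone never played with modelines: the arithmetic behind that tinkering is basically two divisions, sketched below in Python (my own throwaway helper, not from any tool; the numbers are what I believe to be the standard VESA 1024x768@85 timing, so treat them as illustrative and check your own monitor's sync limits).

    # Rough modeline arithmetic: vertical refresh is dot clock / (htotal * vtotal),
    # and the horizontal sync rate (dot clock / htotal) is what must stay inside
    # the monitor's rated range.
    def modeline_rates(pixel_clock_mhz, htotal, vtotal):
        """Return (hsync in kHz, vertical refresh in Hz) for a total raster."""
        hsync_khz = pixel_clock_mhz * 1e3 / htotal
        refresh_hz = pixel_clock_mhz * 1e6 / (htotal * vtotal)
        return hsync_khz, refresh_hz

    # Believed to be the VESA 1024x768@85 timing: 94.5 MHz dot clock,
    # 1376x808 total raster including blanking.
    hsync, refresh = modeline_rates(94.5, htotal=1376, vtotal=808)
    print(f"hsync ~ {hsync:.1f} kHz, refresh ~ {refresh:.1f} Hz")  # ~68.7 kHz, ~85.0 Hz

With a fixed dot-clock (and video RAM) budget, this makes the trade-off above obvious: more total pixels per frame means a lower refresh rate, which is exactly why the 15" had to drop to 800x600 to hit 72 Hz.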
Our first PC was a 486 with a crappy 15" monitor, but I was too young to remember the details clearly. I consider myself lucky, though: my dad was a big fan of Sony, and we soon moved to a nice 17" Trinitron :)
1 reply →
On a green ZnS:Cu phosphor, even 20 Hz is minimal flicker.
Monochrome CRT phosphors like P4 (zinc sulfide with silver) have longer persistence than the ones used in color CRTs, so flicker is less noticeable.
> Whenever I saw a CRT running at 60 Hz, I'd immediately be able to tell. I always used at minimum 75 Hz, but preferably 85 Hz, at home (early 2000s, Windows).
Same here. I remember installing some program that would let you quickly change the display settings on basically every computer I ever interacted with. It was especially bad if the CRT was in a room with fluorescent lighting.
If your lighting and your display flicker at a simple integer ratio, you will notice it unless the frequencies are extremely high. 1:1 is the most likely case, because it is easy to sync both the lights and the CRT to the AC line frequency, which is 60 Hz in the US (50 Hz in Europe). 1:2 (which used to be somewhat common) or 4:5 ratios would also cause issues; there's a rough sketch of the beat arithmetic below.
Though now that I think of it, the CRT should be syncing to the incoming signal, and there is no inherent reason that sync needs to be related to the AC line, but it is anyway. (All the computers I know of generate their own sync from a crystal; I have no idea where TV stations get their sync, but I doubt it's the AC line frequency.)
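To make the ratio argument concrete, here's a rough beat-frequency sketch (my own illustration with a made-up lowest_beat helper, assuming the dominant flicker of a fluorescent tube on 60 Hz mains is at 120 Hz, since it lights up on both half-cycles): when a small multiple of the refresh rate lands close to a small multiple of the lamp flicker, the difference shows up as a slow, very visible rolling or pulsing pattern.

    # Smallest |m * refresh - k * lamp| over small harmonic pairs (m, k);
    # higher harmonics beat too, but with much less energy.
    def lowest_beat(refresh_hz, lamp_hz, max_harmonic=2):
        return min(
            abs(m * refresh_hz - k * lamp_hz)
            for m in range(1, max_harmonic + 1)
            for k in range(1, max_harmonic + 1)
        )

    lamp = 120.0  # fluorescent tube on 60 Hz mains, lit on both half-cycles

    for refresh in (59.94, 60.0, 75.0, 85.0):
        beat = lowest_beat(refresh, lamp)
        print(f"{refresh:6.2f} Hz CRT under {lamp:.0f} Hz lamp -> lowest beat ~ {beat:.2f} Hz")

A nominally 60 Hz (or NTSC's 59.94 Hz) refresh against that lamp puts the lowest beat at a fraction of a hertz, the same slowly rolling/pulsing intermodulation effect the Wikipedia quote below describes, while 75 and 85 Hz push the lowest beats tens of hertz away.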
Wikipedia's NTSC article alludes to a couple of reasons why you'd want your refresh rate to be based on your power-line frequency:
> Matching the field refresh rate to the power source avoided intermodulation (also called beating), which produces rolling bars on the screen. Synchronization of the refresh rate to the power incidentally helped kinescope cameras record early live television broadcasts, as it was very simple to synchronize a film camera to capture one frame of video on each film frame by using the alternating current frequency to set the speed of the synchronous AC motor-drive camera.
(I suspect shows that were pre-recorded and telecined for broadcast would've also been filmed at 30 fps using a synchronous AC motor.)
> In early TV systems, a master voltage-controlled oscillator was run at twice the horizontal line frequency, and this frequency was divided down by the number of lines used (in this case 525) to give the field frequency (60 Hz in this case). This frequency was then compared with the 60 Hz power-line frequency and any discrepancy corrected by adjusting the frequency of the master oscillator.
I think later TVs would've just synchronized to the received signal.
https://en.wikipedia.org/wiki/NTSC#Resolution_and_refresh_ra...
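Just to make the quoted division concrete, here's the black-and-white NTSC arithmetic spelled out (my own sketch, only filling in the numbers the quote implies):

    # Early NTSC timing chain per the quote: master oscillator at twice the
    # horizontal line rate, divided by the line count to give the field rate,
    # which is then compared against the 60 Hz power line.
    LINES_PER_FRAME = 525
    FIELD_RATE_HZ = 60.0

    master_hz = FIELD_RATE_HZ * LINES_PER_FRAME  # 31,500 Hz master oscillator
    line_rate_hz = master_hz / 2                 # 15,750 Hz horizontal rate
    frame_rate_hz = FIELD_RATE_HZ / 2            # 30 frames/s (two interlaced fields each)

    print(f"master: {master_hz:.0f} Hz, line rate: {line_rate_hz:.0f} Hz, "
          f"frames: {frame_rate_hz:.0f}/s")

Running it just fills in the implied numbers: a 31,500 Hz master, a 15,750 Hz line rate, and 30 frames (60 fields) per second.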
Me too. I'm also really sensitive to PWM. I tried using 85 Hz on my VGA monitor, but the higher signal bandwidth and cheap hardware made the image noticeably blurrier. 70 Hz wasn't a great compromise either.
Since TFTs came along, I've been bothered by it a lot less because of the lack of flicker (though some cheap 4-bit TN LCDs still had it with some colours).