Comment by cm2187
2 days ago
A lot of those CRT screens had a pretty low refresh frequency; you were basically sitting in front of a giant stroboscope. That was particularly bad for computer screens, where you were sitting right in front of them. I think they pretty much all displayed at 30Hz. I can imagine how a gigantic screen could get pretty uncomfortable.
I recall a lot of people playing Counter-Strike at 640x480 to get 100+Hz refresh rates. The lower the resolution, the faster you could refresh. I don't recall the absolute limit, but it would give the latest LCD gaming panels a serious run for their money.
In the meantime, OLED monitors can go to 480Hz.
If you pay extra for that. Meanwhile _any_ CRT could trade off resolution for refresh rate across a fairly wide range. In fact the standard resolutions for monitors were all just individual points in a larger space of possibilities. They could change aspect ratio as well. This can be quite extreme. Consider the 8088 MPH demo from a few years back (<https://trixter.oldskool.org/2015/04/07/8088-mph-we-break-al...>). See the part near the end with the pictures of 6 of the authors? That video mode only had 100 lines, but they were scrunched up to fake a higher resolution.
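To make that "space of possibilities" a bit more concrete, here's a rough back-of-the-envelope sketch. The monitor limits and blanking overhead below are made-up illustrative numbers, not the specs of any real model; the point is just that fewer visible lines per frame means fewer horizontal sweeps, so the same horizontal scan rate buys a faster vertical refresh.

```python
# Rough sketch of the resolution/refresh trade-off on a multisync CRT.
# The limits here are made-up illustrative numbers, not any real model's specs.

H_SYNC_MAX_KHZ = 96.0   # hypothetical top horizontal scan rate the tube accepts
V_REFRESH_CAP = 160     # hypothetical cap on vertical refresh
BLANK_FACTOR = 1.05     # assume ~5% extra lines per frame for vertical blanking

for visible_lines in (480, 600, 768, 1024, 1200):
    total_lines = visible_lines * BLANK_FACTOR
    # Fastest refresh: drawing every line at the monitor's top horizontal rate.
    max_refresh = min(H_SYNC_MAX_KHZ * 1000 / total_lines, V_REFRESH_CAP)
    print(f"{visible_lines:4d} visible lines -> up to ~{max_refresh:.0f} Hz")
```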
All CRT televisions were either 60Hz or 50Hz, depending on where you were in the world.
Yes and no. Half of the screen was refreshing at a time, so it was really flashing at 30Hz. You still had a visible stroboscopic effect. True 60Hz and 100Hz screens appeared in the late 90s and made a visible difference in terms of viewing comfort.
I think you're mixing monitors and TVs together.
CRT TVs only supported vertical refresh rates of 50Hz or 60Hz, matching the regional mains frequency. They used interlacing and technically only showed half the frame at a time, but thanks to phosphor decay this added a feeling of fluidity to the image. If you were able to see it strobe, you must have had impressive eyesight. And even if they had supported higher refresh rates, it wouldn't have mattered, as the source of the signal would only ever be 50/60Hz.
CRT monitors used in PCs, on the other hand, supported a variety of refresh rates. Only monitors for specific applications used interlacing; consumer-grade ones didn't, which means you could see a strobing effect if you ran one at a low frequency. But even the most basic analog monitors from the 80s supported at least 640x480 at 60Hz, and some programs, such as the original DOOM, were even able to squeeze 70Hz out of them by running at a different resolution while matching the horizontal refresh rate.
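For the curious, that 70Hz figure falls straight out of the timing arithmetic: vertical refresh is the horizontal scan rate divided by the total lines per frame (visible plus blanking). A quick sketch using the commonly cited VGA line counts (quoted from memory, so treat them as approximate):

```python
# Vertical refresh = horizontal scan rate / total lines per frame.
# VGA drives both 640x480 and the DOOM-style 320x200 modes at roughly the
# same ~31.47 kHz horizontal rate; the total line counts (525 and 449) are
# the commonly cited VGA timings, quoted from memory.

H_SCAN_HZ = 31_469  # ~31.47 kHz horizontal scan rate

for mode, total_lines in (("640x480, 525 total lines", 525),
                          ("320x200 line-doubled, 449 total lines", 449)):
    print(f"{mode}: {H_SCAN_HZ / total_lines:.1f} Hz vertical refresh")
# -> roughly 59.9 Hz and 70.1 Hz
```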
Yeah I remember I could not use a CRT computer monitor at 60Hz or less for any length of time, as the strobing gave me a headache.
I'm guessing you're talking about interlacing?
I've never really experienced it because I've always watched PAL which doesn't have that.
But I would have thought it would be perceived as flashing at 60 Hz with a darker image?
Except CRT televisions weren't like that at all.
The only time the electron gun was not involved in producing visible light was during overscan, horizontal retrace, and the vertical blanking interval. They spent the entire rest of their time (the very vast majority of their time) busily drawing rasterized images onto phosphors (with their own persistence!) for display.
This resulted in a behavior that was ridiculously dissimilar to a 30Hz strobe light.
Did they really do that, or did the tubes just run at 2x vertically stretched 640x240 with a vertical pixel shift? A lot of technical descriptions of CRTs seem to be adapted from pixel-addressed LCDs/OLEDs, and they don't always capture the design well.
They did exactly what you say. Split the image and pixel shift. It was not like 30Hz at all.
Thanks. I wonder if the phosphor stripes are giving people the false impression that those are pixels...
The limiting factor is the horizontal refresh frequency. TVs and older monitors were around 15.75kHz, so the maximum number of horizontal lines you could draw per second was around 15,750. Divide that by 60 and you get 262.5, which is therefore the maximum vertical resolution (real world is lower for various reasons). CGA ran at 200 lines, so it was safely possible with a 60Hz refresh rate.
If you wanted more vertical resolution then you needed either a monitor with a higher horizontal refresh rate or you needed to reduce the effective vertical refresh rate. The former involved more expensive monitors, the latter was typically implemented by still having the CRT refresh at 60Hz but drawing alternate lines each refresh. This meant that the effective refresh rate was 30Hz, which is what you're alluding to.
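The arithmetic above in one place, just restating the numbers from this comment:

```python
# The horizontal scan rate fixes how many lines you can draw per second;
# dividing by the vertical rate gives the lines available per pass.

H_SCAN_HZ = 15_750      # ~15.75 kHz, NTSC-class TVs and early monitors
V_RATE_HZ = 60

lines_per_pass = H_SCAN_HZ / V_RATE_HZ   # 262.5 lines every 1/60th of a second
progressive_ceiling = lines_per_pass     # non-interlaced ceiling (~200 visible after blanking)
interlaced_ceiling = lines_per_pass * 2  # ~525-line frame, but each line is only
                                         # redrawn 30 times per second
print(lines_per_pass, interlaced_ceiling)   # 262.5 525.0
```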
But the reason you're being downvoted is that at no point was the CRT running with a low refresh rate, and best practice was to use a mode that your monitor could display without interlace anyway. Even in the 80s, using interlace was rare.
Interlace was common on platforms like the Amiga, whose video hardware was tied very closely to television refresh frequencies for a variety of technical reasons which also made the Amiga unbeatable as a video production platform. An Amiga could do 400 lines interlaced NTSC, slightly more for PAL Amigas—but any more vertical resolution and you needed later AmigaOS versions and retargetable graphics (RTG) with custom video hardware expansions that could output to higher-freq CRTs like the SVGA monitors that were becoming commonplace...
Amigas supported interlace, but I would strongly disagree that it was common to use it.
CGA ran pretty near 262 or 263 lines, as did many 8-bit computers. 200 addressable lines, yes, but the background color accounted for about another 40 or so lines, and blanking took up the rest.
Everything runs at 262.5 lines at 60Hz on a 15.75kHz display - that's how the numbers work out.
The irony is that most of those downvoting didn't spend hours in front of those screens as I did. And I do remember these things were tiring, particularly in the dark. And the worst of all were computer CRT screens, which weren't interlaced (in the mid 90s, before higher refresh frequencies started showing up).
I spent literally thousands of hours staring at those screens. You have it backwards. Interlacing was worse in terms of refresh, not better.
Interlacing is a trick that lets you sacrifice refresh rates to gain greater vertical resolution. The electron beam scans across the screen the same number of times per second either way. With interlacing, it alternates between even and odd rows.
With NTSC, the beam scans across the screen 60 times per second. With NTSC non-interlaced, every pixel will be refreshed 60 times per second. With NTSC interlaced, every pixel will be refreshed 30 times per second since it only gets hit every other time.
And of course the phosphors on the screen glow for a while after the electron beam hits them. It's the same phosphor, so in interlaced mode, because it's getting hit half as often, it will have more time to fade before it's hit again.
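A toy sketch of that scan pattern (purely illustrative, not tied to any real hardware): over one second of 60 fields, a progressive display redraws every line in every field, while an interlaced one hits each individual line only every other field.

```python
# Toy scan-order illustration: over one second of 60 fields, count how many
# times a given scanline gets redrawn, progressive vs interlaced.

FIELDS_PER_SECOND = 60

def redraws_per_second(line, interlaced):
    hits = 0
    for field in range(FIELDS_PER_SECOND):
        if not interlaced:
            hits += 1                      # progressive: every line, every field
        elif line % 2 == field % 2:
            hits += 1                      # interlaced: even lines on even fields,
                                           # odd lines on odd fields
    return hits

print(redraws_per_second(100, interlaced=False))  # 60 -> phosphor refreshed 60x/sec
print(redraws_per_second(100, interlaced=True))   # 30 -> twice as long to fade
```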
If they weren't interlaced then they were updating at 60Hz, even in the 80s. You're being very confidently wrong here.
I did 1024x768@85 just fine.
If it supported it, 100 Hz paired with a mouse set for 200 Hz was nice and smooth.