Comment by cubefox
1 year ago
Our eyes constantly, and mostly unconsciously, track moving objects in our field of view to keep them still relative to our eyes. This is called smooth pursuit: https://en.wikipedia.org/wiki/Smooth_pursuit
This is because our retina has a very low effective "refresh rate", which means moving things easily blur together; smooth pursuit prevents that. However, modern sample-and-hold displays like LCD and OLED work against smooth pursuit. If you watch anything moving on a screen (including "still" objects that move on screen due to camera movement), your eyes will automatically track those objects whenever they are momentarily the focus of attention, which should keep them still relative to your eyes and thus make them appear sharp.
However, since the tracked object is still relative to your eyes while each individual frame is still relative to the screen, the frames move relative to your eyes. That means they appear blurry during smooth pursuit, when in reality they should be perfectly sharp.
For example, your eyes track a sign that moves across the screen due to camera movement. Say it moves 10 pixels per frame horizontally. Then you will see a 10 pixel wide horizontal blur on that sign, which could make it unreadable. In reality (looking at a real sign rather than a screen) the sign would appear perfectly clear.
On CRT screens this doesn't happen (to the same extent), because each frame is not displayed for the entire frame time (e.g. 1/60th of a second) but for much less of it. The CRT just very quickly flashes each frame and is dark in between; strobing/flickering, basically. So if the tracked object moves 10 pixels per frame but the frame is only visible for (say) 1/5th of the frame time, the object moves only 2 pixels while the frame is actually on screen. That gives only 2 pixels of blur, which is much less.
Of course, at 60 FPS you might instead get some degree of perceptible flicker (computer CRTs therefore often ran higher than 60 Hz), and the overall achievable brightness is lower, since the screen is black for most of each frame time; CRTs had a low maximum brightness. But they had very little of the "persistence blur" that plagues sample-and-hold screens like OLED and LCD.
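To put rough numbers on it, here's a back-of-the-envelope sketch in Python (a sketch, not a display measurement; the speed and lit fractions are just the example values from this comment):

    def persistence_blur_px(speed_px_per_frame, lit_fraction):
        # While the frame is lit, your eye keeps moving but the frame
        # doesn't, so the image smears across the retina by this many
        # pixels. lit_fraction is 1.0 for sample-and-hold LCD/OLED and
        # much smaller for a strobed display like a CRT.
        return speed_px_per_frame * lit_fraction

    speed = 10  # the tracked sign moves 10 px per frame

    print(persistence_blur_px(speed, 1.0))  # sample-and-hold: 10 px blur
    print(persistence_blur_px(speed, 0.2))  # CRT lit ~1/5 of the time: 2 px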
The motion blur intentionally introduced by video games is there to make moving objects that are not tracked by our eyes appear smoother. For untracked objects, motion blur is natural (smooth pursuit doesn't try to remove it). So some forms of motion blur are undesirable and others are desirable.
The optimal solution would be to run games (and video content in general) at an extremely high frame rate (like 1000 FPS), which would introduce natural perceptible motion blur where it naturally occurs and remove it where it doesn't (during smooth pursuit). But obviously that would be a computationally extremely inefficient way to render games.
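The same arithmetic shows why such a high frame rate helps on a sample-and-hold display: each frame stays lit for the full 1/fps, so the blur shrinks in direct proportion to the frame rate (again just a sketch, using the pan speed from the sign example):

    speed_px_per_s = 600  # 10 px/frame at 60 FPS, as in the sign example

    for fps in (60, 120, 240, 1000):
        # sample-and-hold: each frame is lit for the full 1/fps,
        # so the blur equals the distance travelled per frame
        print(fps, "FPS ->", speed_px_per_s / fps, "px of blur")
    # 60 FPS -> 10 px, 120 -> 5 px, 240 -> 2.5 px, 1000 -> 0.6 px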
By the way, if you have a screen with 120+ Hz you can test the above via this black frame insertion demo, which emulates how CRTs work:
https://testufo.com/blackframes
On my 120 Hz OLED screen, the 40 FPS UFO (1 frame + 2 black frames) looks as clear as the native 120 Hz UFO. A real 60 or even 80 Hz CRT screen would be even better in terms of motion clarity, perhaps better than a 240 or even 480 Hz OLED.
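That equivalence falls out of the same arithmetic: with 1 lit + 2 black refreshes on a 120 Hz panel, each 40 FPS frame is lit for exactly one 1/120 s refresh, the same persistence as a native 120 Hz frame (the ~1.5 ms CRT phosphor persistence below is a rough assumption on my part, not a measured value):

    refresh_s = 1 / 120
    speed_px_per_s = 600  # same pan as before

    # Native 120 Hz sample-and-hold: each frame lit for one full refresh.
    # 40 FPS with BFI (1 lit + 2 black): each frame is also lit for exactly
    # one refresh, so the blur width is identical.
    print(speed_px_per_s * refresh_s)  # 5.0 px in both cases

    # A CRT flash decays within a millisecond or two (assumed ~1.5 ms):
    print(speed_px_per_s * 0.0015)    # ~0.9 px, far less blur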