
Comment by arghwhat

21 hours ago

> Things like lens flares, motion blur, film grain, and shallow depth of field are mimicking cameras and not what being there is like

Ignoring film grain, our vision exhibits all of these effects all the same.

Look in front of you and only a single plane will be in focus (and only your fovea produces any sort of legibility). Look towards a bright light and you might get flaring from just your eyes. Stare out the side of a car or train when travelling at speed and you'll see motion blur, interrupted only by brief clarity if you intentionally try to follow the motion with your eyes.

Without depth of field simulation, the whole scene is just a flat plane with completely unrealistic clarity, and because the screen is comparatively small, too much of it sits smack in the center of your fovea. The problem is that these are simulations that do not track your eyes, and so they make the (mostly valid!) assumption that you're looking at, near, or in front of whatever you're controlling.

Maybe motion blur becomes unnecessary given a high enough resolution and refresh rate, but depth of field either requires actual depth or foveal tracking (which only works for one person). Tasteful application of current techniques is probably better.
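To make that assumption concrete, here's a minimal sketch of what a typical DoF pass does: take the depth buffer plus one assumed focus distance and compute a thin-lens circle of confusion per pixel. The names and numbers are mine, not any particular engine's.

```python
import numpy as np

def coc_diameter(depth, focus_dist, focal_len=0.05, f_number=2.8):
    """Thin-lens circle-of-confusion diameter (metres on the virtual sensor).

    depth      -- per-pixel scene depth from the depth buffer, in metres
    focus_dist -- the single distance the pass assumes you're looking at
                  (typically the player character or the reticle target)
    """
    aperture = focal_len / f_number  # aperture diameter from the f-number
    return aperture * focal_len * np.abs(depth - focus_dist) / (
        depth * (focus_dist - focal_len))

# Everything at focus_dist gets zero blur; everything else is blurred in
# proportion to its CoC - regardless of where your eyes actually are.
depth_buffer = np.array([1.0, 3.0, 10.0, 50.0])  # metres
print(coc_diameter(depth_buffer, focus_dist=3.0))
```

Nothing in that calculation knows where your eyes are pointed; the one free parameter is a guess.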

> High FPS television can feel cheap while 24fps can feel premium and "filmic."

Ugh. I will never understand the obsession with this effect. There is no such thing as a "soap opera effect" as people like to call it, only a slideshow effect.

The history behind this is purely a series of cost-cutting measures, entirely unrelated to the user experience or artistic qualities. 24 fps came about because audio was slapped onto the film, and it was the slowest speed at which the audio track was acceptably intelligible, saving costly film stock - the sole priority at the time. Before that, we used to record content at variable frame rates but play it back at 30-40 fps.

We're clinging to a cost-cutting measure that was already a significant compromise back in the days of hand-cranked film cameras.

</fist-shaking rant>

> Ugh. I will never understand the obsession with this effect.

All of these (lens flares, motion blur, film grain, DoF, tone mapping, exposure, frame rate) are artistic choices constrained by the equipment we have to capture and present them. I think they'll always follow trends. In my entire career following film, photography, computer graphics, and game dev, the only time I've heard anyone talk about how we actually experience any of these things is when people say humans see roughly the equivalent of a 50mm lens (on 35mm film).

Just look at the trend in frame size. Film was roughly 4:3, and television copied it. Then film started matting/cropping the frame. It got crazy with super-widescreen, to the point where some films used three projectors side by side, before most settled on 16:9 - and television copied that too. Widescreen is still seen as more "filmic." I remember being surprised, working on a feature that switched to Cinemascope's aspect ratio, to see that the frame was only about 850 pixels tall - a full frame would be roughly twice that.
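For rough context, assuming a 2K pipeline (my assumption - the exact resolution isn't the point):

```python
# Back-of-the-envelope check, assuming a 2K digital intermediate:
width_2k      = 2048
full_aperture = 1556                    # a 2048x1556 full-aperture scan, ~4:3
scope_height  = round(width_2k / 2.39)  # CinemaScope-style 2.39:1 frame
print(scope_height)                     # ~857 pixels tall
print(full_aperture / scope_height)     # ~1.8x - "about twice" the height
```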

To me, high frame rate was always just another style. My only beef was with motion-smoothing muddying up footage shot at different frame rates.

> Look in front of you and only a single plane will be in focus (and only your fovea produces any sort of legibility). Look towards a bright light and you might get flaring from just your eyes. Stare out the side of a car or train when travelling at speed and you'll see motion blur, interrupted only by brief clarity if you intentionally try to follow the motion with your eyes.

The problem is the mismatch between what you’re looking at on the screen and what the in-game camera is looking at. If these were synchronised perfectly it wouldn’t be a problem.
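Put another way: if the focus distance fed into the DoF pass came from an eye tracker instead of a fixed guess, the mismatch would go away for the one tracked viewer. A rough sketch - the gaze input and sampling here are assumptions, not any shipping eye-tracking API:

```python
import numpy as np

def gaze_driven_focus(depth_buffer, gaze_uv):
    """Pick the DoF focus distance from wherever the viewer is actually looking.

    depth_buffer -- HxW array of scene depths in metres
    gaze_uv      -- (u, v) gaze position in [0, 1], from a hypothetical
                    eye tracker (real ones need calibration and smoothing)
    """
    h, w = depth_buffer.shape
    x = min(int(gaze_uv[0] * w), w - 1)
    y = min(int(gaze_uv[1] * h), h - 1)
    return depth_buffer[y, x]  # feed this into a per-pixel CoC pass each frame

# Toy example: a 4x4 depth buffer with a near object where the viewer is looking.
depth = np.full((4, 4), 10.0)
depth[0, 3] = 2.0
print(gaze_driven_focus(depth, gaze_uv=(0.95, 0.05)))  # -> 2.0
```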