Comment by arghwhat
1 day ago
> Things like lens flares, motion blur, film grain, and shallow depth of field are mimicking cameras and not what being there is like
Ignoring film grain, our vision has all these effects all the same.
Look in front of you and only a single plane will be in focus (and only your fovea produces any sort of legibility). Look towards a bright light and you might get flaring from just your eyes. Stare out the side of a car or train when driving at speed and you'll see motion blur, interrupted only by brief clarity if you intentionally try to follow the motion with your eyes.
Without depth of field simulation, the whole scene is just a flat plane with completely unrealistic clarity, and because it's comparatively small, too much of it is smack in the center of your fovea. The problem is that these are simulations that do not track your eyes, and they make the (mostly valid!) assumption that you're looking at, near, or in front of whatever you're controlling.
Maybe motion blur becomes unnecessary given a high enough resolution and refresh rate, but depth of field either requires actual depth or foveal tracking (which only works for one person). Tasteful application of current techniques is probably better.
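To make that concrete, here's a minimal sketch of what a non-eye-tracked DoF post-process boils down to per pixel: a thin-lens circle-of-confusion estimate around an assumed focus distance. The names and numbers below are my own illustration, not taken from any particular engine.

```python
# Minimal thin-lens circle-of-confusion sketch (illustrative only, not from any engine).
# "focal_dist_m" is the guess the effect makes about where you're looking; without
# eye tracking it is usually the crosshair or the player character.

def circle_of_confusion(depth_m, focal_dist_m, focal_len_mm=50.0, f_number=2.8):
    """Blur-spot diameter on the sensor (mm) for a point at depth_m metres."""
    f = focal_len_mm / 1000.0          # focal length in metres
    aperture = f / f_number            # aperture diameter in metres
    # Thin-lens CoC: aperture * (|depth - focus| / depth) * (f / (focus - f))
    coc_m = aperture * (abs(depth_m - focal_dist_m) / depth_m) * (f / (focal_dist_m - f))
    return coc_m * 1000.0              # back to millimetres

# A point 2 m behind the assumed focus plane blurs far more than one 0.2 m behind it.
print(circle_of_confusion(depth_m=4.0, focal_dist_m=2.0))   # ~0.23 mm
print(circle_of_confusion(depth_m=2.2, focal_dist_m=2.0))   # ~0.04 mm
```

A game post-process does roughly this per pixel from the depth buffer and then blurs by the resulting radius, so the whole effect lives or dies by that focus-distance guess - which is exactly the eye-tracking problem above.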
> High FPS television can feel cheap while 24fps can feel premium and "filmic."
Ugh. I will never understand the obsession with this effect. There is no such thing as a "soap opera effect", as people like to call it, only a slideshow effect.
The history behind this is purely a series of cost-cutting measures, entirely unrelated to the user experience or artistic qualities. 24 fps came to be because audio was slapped onto the film, and it was the slowest speed at which the audio track was acceptably intelligible, saving costly film stock - the sole priority of the time. Before that, we used to record content at variable frame rates but play it back at 30-40 fps.
We're clinging on to a cost-cutting measure that was a significant compromise from the time of hand cranked film recording.
</fist-shaking rant>
> Ugh. I will never understand the obsession with this effect.
All of these (lens flares, motion blur, film grain, DoF, tone mapping, exposure, frame rate) are artistic choices constrained by the equipment we have to capture and present them. I think they'll always follow trends. In my entire career following film, photography, computer graphics, and game dev, the only time I've heard anyone talk about how we experience any of those things is when people say humans see roughly the equivalent of a 50mm lens (on 35mm film).
Just look at the trend of frame size. Film was roughly 4:3, and television copied it. Then film started matting/cropping the frame. It got crazy with super-widescreen, to the point where some films used three projectors side by side, and most eventually settled on 16:9. Then television copied it. Widescreen is still seen as more "filmic." I remember being surprised, working on a feature that switched to Cinemascope's aspect ratio, to see that the frame was only about 850 pixels tall--a full frame would be about twice that.
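For concreteness, here's the arithmetic behind that height. The 2K width is an assumption on my part; the ~850 px figure and "about twice that" are from the post above.

```python
# Pixel heights at an assumed 2K digital-intermediate width of 2048 px.
width = 2048

scope_height = width / 2.39   # Cinemascope-style ~2.39:1 frame -> ~857 px tall
full_height  = width / 1.33   # ~4:3 "full" frame               -> ~1540 px tall

print(round(scope_height), round(full_height))  # 857 1540, i.e. roughly half the height
```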
To me, high frame rate was always just another style. My only beef was with motion-smoothing muddying up footage shot at different frame rates.
The problem is that it just doesn't work on modern, fast displays. Without motion smoothing on a fast and bright screen, 24fps/30fps goes from "choppy" to "seizure inducing and unwatchable". Older sets would just naturally smooth things out.
Even on my LCD TV, smooth motion such as credits scrolling at certain speeds is extremely uncomfortable to watch at these frame rates.
I consider it borderline irresponsible to continue using these framerates, forcing users into frame interpolation and its horrible artifacts - a decision the manufacturer might even have made for them. 120 Hz displays are finally becoming the norm for regular content (with monitors going to 500+ Hz nowadays); we should at least be able to get 60 fps as the lower bound for regular content delivery.
Going further down for artistic value, e.g. for stop motion or actual slideshows, is less of a problem in my opinion. It's not as disturbing, and if regular content were appropriately paced, there would be no need for interpolation to mess with it...
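For illustration, the crudest possible version of what an interpolating TV does is just synthesizing in-between frames. Real sets use motion-compensated estimation, which is exactly where the smearing and halo artifacts come from; the naive blend below is only a sketch of the idea, not how any actual TV implements it.

```python
import numpy as np

def naive_interpolate(frame_a: np.ndarray, frame_b: np.ndarray, t: float) -> np.ndarray:
    """Synthesize an in-between frame by linear blending.

    Real motion smoothing estimates per-block motion vectors and warps pixels
    along them; when the estimation fails you get the familiar halo artifacts.
    A plain blend like this produces ghosting instead.
    """
    blended = (1.0 - t) * frame_a.astype(np.float32) + t * frame_b.astype(np.float32)
    return blended.astype(frame_a.dtype)

# The four in-between frames needed to take a 24 fps pair up to 120 fps playback.
a = np.zeros((1080, 1920, 3), dtype=np.uint8)        # stand-in frame N
b = np.full((1080, 1920, 3), 255, dtype=np.uint8)    # stand-in frame N+1
in_betweens = [naive_interpolate(a, b, t) for t in (0.2, 0.4, 0.6, 0.8)]
```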
> Just look at the trend of frame size.
Frame size is different from the other parameters, as it is solely a physical practicality. Bigger is better in all directions, but a cinema screen needs to fit in the building - making a building much taller is less economical than making it wider, and making it whatever it isn't right now adds novelty.
The content needs to be made for the screen, with the appropriate balance of periphery and subject, to not look completely wrong, so screen technology and recording technology tend to align. Economy of scale causes standardization on lenses and image circles, and on the choice of aspect ratio within that circle on the film, forming a feedback loop that enforces the parameters for almost all content.
If some technology elsewhere in the stack causes a change, some will follow it for the novelty while others simply follow the falling cost, and soon all content aligns on the new format, and the majority of home TV sets end up shaped to fit the majority of content they can receive.
> Look in front of you and only a single plane will be in focus (and only your fovea produces any sort of legibility). Look towards a bright light and you might get flaring from just your eyes. Stare out the side of a car or train when driving at speed and you'll see motion blur, interrupted only by brief clarity if you intentionally try to follow the motion with your eyes.
The problem is the mismatch between what you’re looking at on the screen and what the in-game camera is looking at. If these were synchronised perfectly it wouldn’t be a problem.
Indeed - I also mentioned that in the paragraph immediately following.
Derp