Comment by Terr_

2 months ago

> Our eyes can see both just fine.

This gets to a gaming rant of mine: Our natural vision can handle these things because our eyes scan sections of the scene with constant adjustment (light-level, focus) while our brain is compositing it together into what feels like a single moment.

However, certain effects in games (e.g. "HDR" and Depth of Field) instead reduce the fidelity of the experience. These features only hold up while our gaze is aimed at the exact spot the software expects. If you glance anywhere else in the scene, you instead perceive an unrealistically wrong coloration or blur that frustratingly persists no matter how much you squint. These problems will remain until gaze-tracking support becomes standard.

So ultimately these features reduce the realism of the experience. They make it less like being there and more like you're watching a second-hand movie recorded on flawed video cameras. This distinction is even clearer if you consider cases where "film grain" is added.

https://www.realtimerendering.com/blog/thought-for-the-day/

It's crazy that post is 15 years old. As the OP and this post get at, HDR isn't really a good description of what's happening. "HDR" often means one or more of at least three different things (capture, storage, and presentation). It's just the sticker slapped on for advertising.

Things like lens flares, motion blur, film grain, and shallow depth of field are mimicking cameras and not what being there is like--but from a narrative perspective we experience a lot of these things through TV and film. It's visual shorthand. Like Star Wars or Battlestar Galactica copying WWII dogfight footage even though it's less like what it would be like if you were there. High FPS television can feel cheap while 24fps can feel premium and "filmic."

Often those limitations are in place so the experience is consistent for everyone. Games will have you set brightness and contrast--I had friends who would crank everything up to avoid jump scares and to clearly see objects intended to be hidden in shadows. Another reason for consistent presentation is to prevent unfair advantages in multiplayer.

  • > Things like lens flares, motion blur, film grain, and shallow depth of field are mimicking cameras and not what being there is like

    Ignoring film grain, our vision has all these effects all the same.

    Look in front of you and only a single plane will be in focus (and only your fovea produces any sort of legibility). Look towards a bright light and you might get flaring from just your eyes. Stare out the side of a car or train when driving at speed and you'll see motion blur, interrupted only by brief clarity if you intentionally try to follow the motion with your eyes.

    Without depth of field simulation, the whole scene is just a flat plane with completely unrealistic clarity, and because it's comparatively small, too much of it sits smack in the center of your fovea. The problem is that these are simulations that do not track your eyes, and they make the (mostly valid!) assumption that you're looking at, near, or in front of whatever you're controlling.

    Maybe motion blur becomes unnecessary given a high enough resolution and refresh rate, but depth of field either requires actual depth or foveal tracking (which only works for one person). Tasteful application of current techniques is probably better. (A rough sketch of the circle-of-confusion math these passes lean on is at the end of this comment.)

    > High FPS television can feel cheap while 24fps can feel premium and "filmic."

    Ugh. I will never understand the obsession with this effect. There is no such thing as a "soap opera effect," as people like to call it, only a slideshow effect.

    The history behind this is purely a series of cost-cutting measures entirely unrelated to the user experience or artistic qualities. 24 fps came to be because audio was slapped onto the film, and it was the slowest speed at which the audio track was acceptably intelligible, saving costly film stock - the sole priority of the time. Before that, we used to record content at variable frame rates but play it back at 30-40 fps.

    We're clinging to a cost-cutting measure that was a significant compromise dating from the era of hand-cranked film recording.

    </fist-shaking rant>
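
    As promised above, here is a minimal, hypothetical sketch of the thin-lens circle-of-confusion math a post-process depth-of-field pass leans on. The function name, focal length, f-stop, and sensor values are illustrative assumptions, not taken from any real engine.

    ```python
    # Hypothetical sketch: per-pixel blur size for a post-process depth-of-field
    # pass, using the thin-lens circle-of-confusion (CoC) formula. All camera
    # parameters below are illustrative defaults, not from any particular game.

    def coc_pixels(depth_m, focus_m, focal_length_mm=50.0, f_stop=2.8,
                   sensor_width_mm=36.0, image_width_px=1920):
        """Blur diameter in pixels for a point at depth_m, focused at focus_m."""
        f = focal_length_mm / 1000.0          # focal length in metres
        aperture = f / f_stop                 # aperture diameter in metres
        # Thin-lens circle of confusion on the sensor plane, in metres.
        coc_m = aperture * f * abs(depth_m - focus_m) / (depth_m * (focus_m - f))
        return (coc_m * 1000.0) / sensor_width_mm * image_width_px

    # A single focus distance is assumed for the whole frame: only depths near
    # it stay sharp; everything else gets a blur kernel sized by its CoC.
    for d in (1.0, 2.0, 5.0, 20.0):
        print(f"depth {d:5.1f} m -> blur ~{coc_pixels(d, focus_m=2.0):5.2f} px")
    ```

    The key point for this thread: without eye tracking, the pass has to pick that one focus distance for everyone, which is exactly the assumption that breaks when you glance elsewhere in the scene.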

    • > Look in front of you and only a single plane will be in focus (and only your fovea produces any sort of legibility). Look towards a bright light and you might get flaring from just your eyes. Stare out the side of a car or train when driving at speed and you'll see motion blur, interrupted only by brief clarity if you intentionally try to follow the motion with your eyes.

      The problem is the mismatch between what you’re looking at on the screen and what the in-game camera is looking at. If these were synchronised perfectly it wouldn’t be a problem.

    • > Ugh. I will never understand the obsession with this effect.

      All of these (lens flares, motion blur, film grain, DoF, tone mapping, exposure, frame rate) are artistic choices constrained by the equipment we have to capture and present them. I think they'll always follow trends. In my entire career following film, photography, computer graphics, and game dev, the only time I've heard anyone talk about how we actually experience any of those things is when people say humans see roughly the equivalent of a 50mm lens (on 35mm film).

      Just look at the trend of frame size. Film was roughly 4:3, and television copied it. Then film started matting/cropping the frame. It got crazy with super widescreen, to the point where some films used three projectors side by side, and most settled on 16:9. Then television copied that. Widescreen is still seen as more "filmic." I remember being surprised, working on a feature that switched to CinemaScope's aspect ratio, to see that the frame was only 850 pixels tall--a full frame would be about twice that.

      To me, high frame rate was always just another style. My only beef was with motion-smoothing muddying up footage shot at different frame rates.

  • > the poster found it via StumbleUpon.

    Such a blast from the past, I used to spend so much time just clicking that button!

I'm with you on depth of field, but I don't understand why you think HDR reduces the fidelity of a game.

If you have a good display (e.g. an OLED) then the brights are brighter and simultaneously there is more detail in the blacks. Why do you think that is worse than SDR?

  • Check out this old post: https://www.realtimerendering.com/blog/thought-for-the-day/

    HDR in games would frequently mean clipping highlights and adding bloom. Prior to "HDR," exposure looked rather flat.

    • That hasn't been what it means since 2016 or so, when consumer TVs got support for properly displaying brighter whites and colors.

      It definitely adds detail now, and it has for the last 8-9 years.

      Though consumer TVs obviously still fall short of being as bright at peak as the real world. (We'll probably never want our TV to burn out our vision like the sun, but hitting highs of at least 1,000-2,000 nits, versus the 500-700 that a lot of sets peak at right now, would be nice for most uses.)

  • The "HDR" here is in the sense of "tone mapping to SDR." It should also be said that even "H"DR displays only have a stop or two more range, still much less than a real-world high-contrast scene.
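
    To make "tone mapping to SDR" concrete, here is a minimal sketch using the extended Reinhard curve as one illustrative operator; it is an assumption for demonstration, not what any particular game or the linked post uses.

    ```python
    # Minimal sketch of "tone mapping to SDR": compress scene-referred HDR
    # luminance into the [0, 1] range a standard-dynamic-range display can show.
    # The extended Reinhard curve is just one common illustrative choice.

    def reinhard_extended(l_scene, l_white=4.0):
        """Map scene luminance (>= 0) so that l_white lands on 1.0 (display
        white); anything brighter must be clamped, i.e. highlights clip."""
        return l_scene * (1.0 + l_scene / (l_white * l_white)) / (1.0 + l_scene)

    for lum in (0.1, 0.5, 1.0, 2.0, 4.0, 16.0):
        out = min(1.0, reinhard_extended(lum))
        print(f"scene luminance {lum:5.1f} -> display value {out:.3f}")
    ```

    Everything at or above l_white ends up at display white, which is the highlight clipping mentioned upthread; a true HDR display raises that ceiling instead of compressing everything underneath it.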

The most egregious example is 3D. Only one thing is in focus, even though the scene is stereoscopic. It makes no sense visually.

  • Hell yeah, this is one of many issues I had with the first Avatar movie. The movie was filled with cool things to look at, but none of it was in focus. Ten minutes in I had had enough and was ready for a more traditional movie experience. Impressive, yes, for ten minutes, then exhausting.

    • this thread is helping me understand why I always thought 3D movies looked _less_ 3D than 2D movies.

      That, and after seeing Avatar 1 in 3D, then seeing Avatar 2 in 3D over 10 years later and not really noticing any improvement in the 3D, I declared 3D movies officially dead (though I haven't done side-by-side comparisons).

I had a similar complaint with the few 3D things I watched when it was being hyped in the past (e.g., when Avatar came out in cinemas, and when 3D home TVs briefly seemed to become a thing 15 years ago). It felt like Hollywood was giving me the freedom to immerse myself, while simultaneously trying to constrain that freedom and force me to look at specific things in specific ways. I don't know what the solution is, but it struck me that we need to adopt lessons from live stage productions more than from cinema if we really want people to think what they're seeing is real.

  • Stereo film has its own limitations. Sadly, shooting for stereo was expensive, and corners were often cut just to get something to show up in a theater where they could charge a premium for a stereo screening. Home video was always a nightmare--nobody wants to wear glasses (and glasses-free stereo TVs had a very narrow viewing angle).

    It may not be obvious, but film has a visual language. If you look at early film, it wasn't a given that the audience would understand what was going on when you cut to something. Panning from one object to another implies a connection. It's built on the visual language of still photography (things like the rule of thirds, using contrast or color to direct your eye, etc.). All of it directs your eye.

    Stereo's own limitations were still being explored. In a regular film, you would do a rack focus to connect something in the foreground to the background. In stereo, when there's a rack focus, people don't follow the camera the same way. In regular film, you could show someone's back in the foreground of a shot and cut them off at the waist. In stereo, that looks weird.

    When you're presenting something, you're always directing where someone is looking--whether it's a play, a movie, or a stereo show. The tools are just adapted to the medium.

    I do think it worked way better for movies like Avatar or How to Train Your Dragon and was less impressive for things like rom coms.

HDR, not "HDR", is the biggest leap in gaming visuals made in the last 10 years, I think.

Sure, you need a good HDR-capable display and a native HDR game (or RTX HDR), but the results are pretty awesome.

These effects serve the artistic intent of the game. The same goes for movies, and it has nothing to do with "second-hand movies recorded on flawed cameras" or with "realism" in the sense of how we perceive the world.

This is why I always turn these settings off immediately when I start any video game for the first time. I could never put my finger on why I didn't like them, but the camera analogy is perfect.