
Comment by pfranz

2 months ago

> Ugh. I will never understand the obsession with this effect.

All of these (lens flares, motion blur, film grain, DoF, tone mapping, exposure, frame rate) are artistic choices constrained by the equipment we have to capture and present them. I think they'll always follow trends. In my entire career following film, photography, computer graphics, and game dev, the only time I've heard anyone talk about how we actually experience any of these things is when people say human vision is roughly equivalent to a 50mm lens (on 35mm film).

Just look at the trend of frame size. Film was roughly 4:3, and television copied it. Then film started matting/cropping the frame. It got crazy with super-widescreen, to the point where some films used three projectors side by side, before most settled on 16:9. Then television copied that too. Widescreen is still seen as more "filmic." I remember being surprised, while working on a feature that switched to CinemaScope's aspect ratio, to see that the frame was only about 850 pixels tall; a full frame would be roughly twice that.

To me, high frame rate was always just another style. My only beef was with motion smoothing muddying up footage shot at different frame rates.

The problem is that it just doesn't work on modern, fast displays. Without motion smoothing on a fast, bright screen, 24fps/30fps goes from "choppy" to "seizure-inducing and unwatchable". Older sets would just naturally smooth things out.

Even on my LCD TV, smooth motion, like credits scrolling at certain speeds, is extremely uncomfortable to watch at these frame rates.

I consider it borderline irresponsible to keep using these frame rates, forcing users into frame interpolation and its horrible artifacts, a decision the manufacturer may even have made for them. 120 Hz is finally becoming the norm for regular displays (with monitors going to 500+ Hz nowadays); we should at least be able to get to 60 fps as the lower bound for regular content delivery.

Going further down for artistic value, e.g. for stop motion or actual slide shows, is less of a problem in my opinion. It is not as disturbing, and if regular content were appropriately paced there would be no need for interpolation to mess with it...

> Just look at the trend of frame size.

Frame size is different from the other parameters, as it is solely a physical practicality. Bigger is better in all directions, but a cinema screen needs to fit in the building: making a building much taller is less economical than making it wider, and making the screen whatever shape it isn't right now adds novelty.

The content needs to be made for the screen, with the appropriate balance of periphery and subject, or the result looks completely wrong, so screen technology and recording technology tend to align. Economies of scale drive standardization of lenses and image circles, and of the aspect ratio chosen within that circle on the film, forming a feedback loop that enforces the same parameters for almost all content.

If some technology somewhere else in the stack causes a change, some will follow it for the novelty and others will simply follow the falling cost; soon all content aligns on the new format, and the majority of home TV sets will be shaped to fit the majority of the content they can receive.

  • > The problem is that it just doesn't work on modern, fast displays.

    I'm very confused by this. From what I've seen it's been getting a lot better since the transition away from CRTs. At least for television, frame-rate matching is becoming more of a thing, and higher refresh rates really help. Calling everything fps for simplicity: 120 divides evenly by 24, 30, and 60, so each source frame can be held for a whole number of refreshes. Lower refresh rates won't divide evenly and cause judder (a quick sketch of the arithmetic is below).
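
    A minimal sketch of that arithmetic, with illustrative rates (not a survey of real hardware):

      # Check which content frame rates map cleanly onto a display refresh rate.
      # If refresh % content == 0, every frame is held for the same whole number
      # of refreshes; otherwise holds are uneven (e.g. 3:2 pulldown judder).

      def frame_hold(refresh_hz: int, content_fps: int) -> str:
          if refresh_hz % content_fps == 0:
              return f"clean: each frame shown {refresh_hz // content_fps}x"
          return "mismatch: uneven frame holds (judder)"

      for refresh in (60, 120):
          for fps in (24, 30, 60):
              print(f"{refresh} Hz @ {fps} fps -> {frame_hold(refresh, fps)}")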

    Similarly (maybe back in the 90s?), film projectors would flash each frame twice to reduce the flicker between frames. With digital, there's no film to advance between frames at all.

    > smooth motion, like credits scrolling at certain speeds, is extremely uncomfortable to watch at these frame rates.

    I think scrolling credits are the most difficult use case: white on black, with hard-edged text and no motion blur. DLP projectors (common 10+ years ago) drive me nuts by displaying R, G, and B separately.

    Outside of credits, cinematographers and other filmmakers do think about these things. I remember hearing a cinematographer talk about working on space documentaries in IMAX: if you panned too quickly, the white spaceship over a black star field could jump multiple feet each frame (rough numbers below). Sure, films shot today are optimized for the theater, but the technology gap between theater and home is nowhere near as big as CRT vs. acetate.
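
    To put rough numbers on that jump (a back-of-the-envelope sketch; the screen width and pan speed are made-up illustrative values, not figures from the talk):

      # Per-frame displacement during a pan: the subject crosses
      # screen_width_ft in pan_seconds, so at a given fps it moves
      # screen_width_ft / (pan_seconds * fps) per frame.

      screen_width_ft = 70.0  # hypothetical IMAX screen width
      pan_seconds = 1.0       # hypothetical fast pan across the full frame
      fps = 24

      print(screen_width_ft / (pan_seconds * fps))  # ~2.9 ft per frame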

    > Frame size is different from the other parameters, as it is solely a physical practicality.

    I'm still struggling to see how it's that different. Widescreen meant a lower effective resolution (it didn't have to; it started with Cinerama and CinemaScope), but it was adopted for cost and aesthetic reasons.

    > If some technology somewhere else in the stack causes a change… soon all content aligns on the new format, and the majority of home TV sets will be shaped to fit the majority of the content they can receive.

    And the industry and audiences are really attached to 24fps. Like you say, home televisions adopted film's aspect ratio, and I've also seen them adopt much better support for 24fps.

    As kind of an aside, I wonder if it's the motion blur people are attached to more than the actual frame rate. I assume you're talking about frame rates higher than 30? Sure, we have faster film stocks and brighter lights, but exposure time is really short. I saw The Hobbit in theaters in both high frame rate and 24fps, and the 24fps version looked weird to me too. I meant to look it up, but I assume they just dropped frames, which would make the blur odd (rough arithmetic below).
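
    If that's right, the blur arithmetic is straightforward: exposure per frame is (shutter angle / 360°) / fps. A hedged sketch (the 270° shutter for the 48fps shoot is the commonly cited figure, not something I've verified):

      # Motion blur per frame, expressed as exposure time.
      def exposure_s(fps: float, shutter_angle_deg: float) -> float:
          return (shutter_angle_deg / 360.0) / fps

      native_24 = exposure_s(24, 180)   # standard 180° shutter: 1/48 s
      derived_24 = exposure_s(48, 270)  # 48fps shoot, alternate frames dropped: 1/64 s

      print(f"native 24fps: {native_24 * 1000:.1f} ms blur per frame")   # ~20.8 ms
      print(f"48->24fps:    {derived_24 * 1000:.1f} ms blur per frame")  # ~15.6 ms

    Each delivered frame would carry about 25% less blur than a native 24fps shoot, which would be consistent with the dropped-frame version looking subtly off.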