Comment by pfranz
2 months ago
> The problem is that it just doesn't work on modern, fast displays.
I'm very confused by this. From what I've seen it's been getting a lot better (since transitioning away from CRTs). At least for television, frame-rate matching is becoming more of a thing, and higher refresh rates really help. Calling everything fps for simplicity: 120 divides evenly by 24, 30, and 60. Lower refresh rates won't divide evenly and cause judder.
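To make the divisibility point concrete, here's a rough sketch (the rate lists are just common examples I picked, not an exhaustive survey):

```python
# Which display refresh rates can hold each source frame for a whole number
# of refreshes (no 3:2-pulldown-style uneven cadence)?
content_rates = [24, 25, 30, 60]      # fps of the source material
display_rates = [60, 90, 120, 144]    # Hz of the panel

for hz in display_rates:
    for fps in content_rates:
        if hz % fps == 0:
            print(f"{hz} Hz shows {fps} fps by holding each frame {hz // fps} refreshes")
        else:
            print(f"{hz} Hz can't evenly show {fps} fps -> uneven cadence (judder)")
```

Running it shows 120 Hz handling 24, 30, and 60 fps cleanly, while 60 Hz has to fudge 24 fps.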
Similarly (maybe back in the 90s?), film projectors in theaters would show each frame twice with a double-bladed shutter to reduce the flicker between frames. With digital, they no longer have to advance the film between frames.
> smooth motion like credits at certain speeds are extremely uncomfortable to look at at these frame rates.
I think scrolling credits are the most difficult use case: white on black, hard-edged text, and no motion blur. DLP projectors (common 10+ years ago) drive me nuts by displaying R, G, and B sequentially.
Outside of credits, cinematographers and other filmmakers do think about these things. I remember hearing a cinematographer talk about working on space documentaries in IMAX: if you panned too quickly, the white spaceship over a black star field could jump multiple feet each frame. Sure, films shot today are optimized for the theater, but the technology gap between theater and home is nowhere near as large as CRT vs. acetate.
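Back-of-the-envelope, that jump is easy to see (the screen width and pan time here are made-up illustrative numbers, not from the talk):

```python
# Per-frame displacement of an object during a pan at 24 fps.
screen_width_ft = 80      # assumed width of a large-format screen
pan_duration_s = 1.0      # assumed time for a fast pan across the frame
fps = 24

jump_per_frame_ft = screen_width_ft / (pan_duration_s * fps)
print(f"each frame the ship jumps ~{jump_per_frame_ft:.1f} ft")  # ~3.3 ft
```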
> Frame size is different from the other parameters, as it is solely a physical practicality.
I'm still struggling to see how it's that different. Widescreen meant a lower effective resolution (it didn't have to--it started with Cinerama and CinemaScope), but was adopted for cost and aesthetic reasons.
> If some technology somewhere else in the stack causes a change…and soon all content aligns on the format, and the majority of home TV sets will be shaped to fit the majority content it can receive.
And the industry and audiences are really attached to 24fps. Like you say, home televisions adopted film's aspect ratio and I've also seen them adopt much better support for 24fps.
As kind of an aside, I wonder if the motion blur is what people are attached to more than the actual frame rate. I assume you're talking about frame rates higher than 30? Sure, we have faster film stocks and brighter lights, but exposure time is really short. I saw The Hobbit in theaters in both high frame rate and 24fps, and the 24fps one looked weird to me, too--I meant to look it up, but I assume they just dropped frames, which made the blur look odd.
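To put rough numbers on that hunch (the 270° shutter for the HFR shoot is my assumption, not something I've verified):

```python
# If 24 fps is derived by dropping every other frame of a 48 fps capture,
# each remaining frame keeps the shorter 48 fps exposure, so the blur is
# thinner than a native 24 fps shoot would produce.
def exposure_s(fps, shutter_deg):
    """Exposure time per frame for a simple rotary-shutter model."""
    return (shutter_deg / 360.0) / fps

native_24 = exposure_s(24, 180)   # ~1/48 s, the usual 24 fps look
hfr_48    = exposure_s(48, 270)   # ~1/64 s per frame (assumed shutter angle)
print(f"native 24 fps blur:   {native_24 * 1000:.1f} ms per frame")
print(f"dropped-frame 24 fps: {hfr_48 * 1000:.1f} ms per frame (shorter -> choppier)")
```

That gap between ~21 ms and ~16 ms of blur per frame would be consistent with the 24fps version looking subtly "off."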