
Comment by TeMPOraL

2 days ago

> Modern taste is more about more neutral-colored foundations with color accents. Don't paint a whole room green -- have a gorgeous green plant that stands out all the more against its neutral background. Don't paint a whole wall orange -- have a beautiful orange-hued piece of art on the wall. It's just more tasteful to use color as one element, along with size, shape, texture, and so forth.

I don't consider this to be the be-all, end-all of design, but I appreciate that designs following this approach can be stunningly beautiful. That said, this is not the problem. The problem is what happens these days: someone films your room with that "gorgeous green plant that stands out all the more against its neutral background" and... color grades the shit out of color, making it near pitch-black on non-HDR TVs (and most computer screens) and merely grey with tiny amounts of trace color on HDR TVs.

This is the problem - or at least its TV aspect. That Napoleon example was spot on - most movies these days look like the right half, whereas anything remotely approaching realism would make it look like the left half. And TFA correctly notices the same washing out of colors is happening to products and spaces in general (which means double trouble when that's filmed and then color-graded some more).

The drained-color thing is exclusive to a certain type of TV/movie drama, plus there's a serious technical problem involving HDR on the device side (which is a whole other story).

But if you watch any comedy, or reality show, or plenty of "normal" dramas, on a regular TV, the color is normal.

However, yes, there has been a certain trend, involving Christopher Nolan, "gritty realism", and legal-political-military-crime themes, of color grading that massively reduces saturation and aggressively pushes towards blue. I don't like it much, but you can also just not watch that stuff. It's stylistic the same way film noir was. Some people hated that back in the day too; now it's just seen as a style of its time.

  • > The drained-color thing is exclusive to a certain type of TV/movie drama

    It's not. There's even a term coined for it, "intangible sludge". https://www.vox.com/culture/22840526/colors-movies-tv-gray-d...

    > I don't like it much but you can also just not watch that stuff.

    It's now permeated everything, so it's hard to not watch stuff, as it's everywhere, with few exceptions.

    • Right. For a long time I wondered what's going on, and eventually started believing it's my fault - that maybe I'm just a rare HDR-poor person watching TV shows on SDR computer displays, maybe I've hit an unusual corner case in the video decoding path, or something. I kept believing that until Star Trek: Picard, Season 3, which made it clear it's not me, it's them.

      The whole show, like everything in the past decade or so, was dark and washed out (except for some space FX parts, where at least some colors were saturated, sometimes). This lasted up until the last two episodes, where, for plot reasons[0], some protagonists found themselves aboard a ship from the TNG-era shows (1980s - 2000s), pulled straight from a museum, which means the set was recreated as it was on the old shows, complete with the lighting. From that scene onwards through the final episode, as it jumped between that one special set and every other dark and gray scene, I had proof in front of me that scenes in modern shows can be properly lit, they just aren't, and it's an active choice[1].

      Importantly, this scene wasn't a one-off gimmick that risked coming out too bright on normal people's HDR-enabled TV screens. The set involved was, per the showrunner, pretty much the whole raison d'etre for the entire season, and they burned most of the season's budget on perfecting it[2]. Them being able to light it well (and have it coexist with every other badly-lit scene) only proves there's no technical obstacle involved - that dark and washed out TV is just a choice everyone's making for... unclear reasons.

      --

      [0] - Hard to explain without giving away a major spoiler and highlight of the era in the franchise.

      [1] - Actually, I can't do this scene justice. But given how massive a moment that was for people following the franchise, I'll just provide a link to the video (SPOILER WARNING): https://youtu.be/t-mY4Xbjyn8?t=42 -- watch in max quality; compare the okay-ish exterior CG early on, observe how dark and washed out the scenes with people are - and this is literally how the entire season (and really, the entire show) looked until that point... or just scroll to 2:27, and then, on a perfect cue - "computer, lights!" - watch the next 30 seconds reveal that everything could've been properly lit from the start, but for some non-technical reason, it wasn't.

      [2] - Most of that was eaten up by casting very specific people, but the set itself was damn expensive too.


    • > It's not. There's even a term coined for it, "intangible sludge".

      It is. The article you link even begins:

      > So many TV shows and movies now

      That's what I'm talking about. Those "so many" belong mostly to a certain type of drama.

      You're not seeing it in comedies. You're not seeing it in reality shows. There are also plenty of dramas that don't have it, possibly a majority but I'm not sure.

      It's not everywhere, contrary to what you say. It may, however, seem "everywhere" if you're only watching that type of drama.


  • > The drained-color thing is exclusive to a certain type of TV/movie drama

    You're absolutely wrong: it happened to video games too. The industry defended it by saying it made games look more "realistic", but has since backed off after consumers revolted and dubbed the aesthetic the "piss filter".

    It started in the mid-'00s, went strong for about a decade, and still persists to a lesser degree today. Only designers like it; consumers broadly hate it.

    • I meant a certain type within TV/movies. As opposed to other types of TV/movies.

      I can't speak to video games, but of course it would make sense it would apply to dramatic video games as well.

> color grades the shit out of color

Color grading itself isn't the problem. It's just a creative tool that can be used well or poorly. The problem is the intentional stylistic choices being made with the tool. I don't have strong opinions about TFA's arguments re: color in general, but as someone deep into cinema production technology, I see a troubling lack of visual diversity in modern cinema, and it's not just color; it's dynamic range and texture too.

It's crazy because this is happening in an era when digital cinema workflows, from cameras to file formats to post-production, let everyone capture, manipulate and distribute visuals with unprecedented levels of fidelity and dynamic range. Even DSLRs down to $3000 can capture full-frame 4K camera raw with >14 stops of dynamic range, which is insane. The great cinematographers of the past needed incredible skill to capture dynamic range from deep shadows to punchy highlights on film, and it was always a risk since they had to wait for dailies. And they had little latitude to manipulate the image captured on the camera negative in post.
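For anyone unfamiliar with the jargon: a "stop" is one doubling of light, so dynamic range in stops is just the base-2 log of the contrast ratio between the brightest and darkest recordable values. A quick sketch of the arithmetic (the numbers are illustrative, not the specs of any particular camera):

```python
import math

def stops(contrast_ratio: float) -> float:
    """Dynamic range in stops: each stop is one doubling of light."""
    return math.log2(contrast_ratio)

# 14 stops of latitude corresponds to a contrast ratio of 2**14 : 1
print(2 ** 14)          # 16384
print(stops(16384.0))   # 14.0
# For comparison, a ~10-stop range is only 2**10 = 1024 : 1
```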

Today's imagers, formats and tools make capturing immense dynamic range not only fast and easy but cheap and virtually risk-free, yet so much cinema looks flat and boring, and there's no technical reason for it. This video shows compelling examples contrasting recent movies with those shot on film in the 90s, and also with movies shot on much less capable digital cinema cameras in the early 2000s, proving it's not digital capture or grading that's driving this: "Why don't movies look like movies any more?" https://www.youtube.com/watch?v=EwTUM9cFeSo

According to the Hollywood cinematographers in the video, it's partly intentional artistic choices, partly the impact of composing and lighting for HDR, partly a lack of creativity and production skill, and in large part an over-focus on flat lighting for VFX shots (because the more expressive the digital camera negative is, the harder it is for VFX teams to match it with CGI).

I'd add another factor: younger cinematographers, LDs and camera ops who learned on high-dynamic-range digital cinema cameras have been trained to shoot with a flat LUT. While this technically maximizes the latitude available for color grading in post (which is generally a good thing), the issue is that many extend this to composing and lighting shots that have virtually no expressive look in the captured digital negative at all. Color grading in post should be for small tweaks, conforming shot-to-shot variance, mastering and, occasionally, saving the day when something goes wrong with a shot. While modern editing and grading tools are immensely powerful, re-framing and grading in post cannot substitute for creative on-set lighting, lensing, composition and exposure choices. Great cinematographers still create their looks with lighting, lens and camera as if there were going to be no grading in post. Unfortunately, this seems to be an increasingly under-valued skill.
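To make the flat-capture point concrete: a log/flat encode squeezes a wide linear scene range into a low-contrast image, and the grade is supposed to put the contrast back. A toy sketch of the idea in Python (both curves are made up for illustration; real cameras use profiles like S-Log or LogC, and real grades are far more sophisticated):

```python
import math

def log_encode(linear: float, mid_grey: float = 0.18) -> float:
    """Toy 'flat'/log encode: squeezes a wide linear range into 0..1.
    (Made-up curve for illustration, not a real camera log profile.)"""
    v = 0.25 * math.log2(linear / mid_grey) / 7 + 0.5
    return max(0.0, min(1.0, v))

def grade_contrast(v: float, pivot: float = 0.5, contrast: float = 1.8) -> float:
    """Toy contrast grade applied in post: expands the flat image back out."""
    return max(0.0, min(1.0, (v - pivot) * contrast + pivot))

# A deep shadow and a bright highlight, in linear light (mid grey = 0.18):
shadow, highlight = 0.02, 1.4
flat = [log_encode(shadow), log_encode(highlight)]
graded = [grade_contrast(v) for v in flat]
# The flat encode pushes both values toward the middle (low contrast on set);
# the grade restores the separation. Fine as a workflow, but no substitute
# for lighting the scene expressively in the first place.
```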

The requirements of modern VFX contribute in a more indirect way as well. It takes on-set time and energy for camera teams to capture and check an increasingly complex list of clean plates, reflection-map spheres and color/contrast references, with specialized LUTs and metadata, at a variety of apertures for every shot. This takes time away from traditional lighting and composition, and ultimately producers don't budget enough time; when something has to give, it's not going to be the VFX plates. In modern effects-heavy productions, the VFX director always has a team on set for every shot verifying they're getting what they need. While this is necessary and understandable, the reverse is rarely true: the cinematographer is not supervising the lighting and composition of all the major VFX elements, because they are being produced by a dozen different vendors over a year-long post-production cycle. This can still work when you have a director like a James Cameron who's hands-on throughout the process and has top-notch VFX-direction and cinematography skills. But that's not the norm. It creates systemic incentives for directors, cinematographers and LDs to lens flat, unexpressive shots, because if there isn't consistent, hands-on creative direction over the whole process, the editor and colorist are left trying to stitch together a bunch of shots and elements that weren't created to exist cohesively in the same frame. I suspect not managing this complexity is how visual disasters like Ant-Man and the Wasp: Quantumania happen.

Sadly, there's no reason it has to be this way. Technically, it's entirely possible to create a VFX-heavy movie that looks like every part of every frame was lensed by a master like Bernardo Bertolucci. Nothing required is even that hard or expensive compared to modern VFX-blockbuster complexity or budgets. I think the reason we haven't seen it yet is twofold: today's top producers, directors and cinematographers rarely have the new and diverse skill sets required in one person, and none of the few with the skills and experience has had both the creative intention and the budget to do it. I'm actually hopeful that in the next few years someone like a Nolan or a Cameron will decide to take it to this level as an aspiration. Currently, many of those with the budgets and cred are choosing to address the challenge by reverting to creating effects with practical sets and in-camera techniques. This can avoid the problem, but it's looking backward instead of embracing the challenge and doing the pioneering work of figuring out how to push through and solve it. Whoever does may discover all-new creative and expressive capabilities.

  • The video you link has turned into a classic.

    But I also disagree with its claim that black shadows everywhere are "cinematic" and desirable.

    They were a limitation of the film stock of the time. When I watch those classic movies, I don't like the fact that all the shadows are crushed. I feel like half the frame is hiding texture that ought to be there. I like the dynamic range of modern cameras.

    We didn't "forget" how to "make movies look like movies". We decided that there's a wider range of ways movies can look, and we're intentionally taking advantage of that for creative freedom. And like always, people will disagree over aesthetic choices.

    I totally understand what you mean, though, about lighting vs grading and where each gets done, but there are good arguments for doing more in the grade rather than in the lighting. It ultimately allows the editor, grader and director to make a lot more choices, and that's generally a good thing. You say "color grading in post should be for small tweaks", but I respectfully disagree. And obviously, there isn't even a choice when it comes to outdoor daytime shots: the look has to be created in the grade.

    • I broadly agree with what you're saying. In my post, I was specifically addressing cases where a lack of expressive diversity in looks is a result of the factors discussed - basically the failure mode where color grading becomes a crutch instead of one part of an intentionally crafted look. In non-failure cases, color grading can be fantastically expressive and a key element in the cinematographer's toolbox.

      As I mentioned, the problem is a strange lack of visual diversity in looks. I'm all for increasing artistic and expressive range and I'm not one of those pining for old-school processes. As you said, film had and still has a lot of limitations. Having been involved in both pre-digital film production and analog video production, we had to spend stupid amounts of effort to avoid or overcome the inherent technical limitations we were saddled with. It was incredibly frustrating and I'd spend time dreaming about a future where those technical (and chemical) limitations no longer haunted us. I guess that's why I'm sort of dismayed that so many creators aren't utilizing the truly incredible technical fidelity even consumer gear provides today.

      I should also have mentioned I don't fully agree with every point made in the video I linked but it is a terrific way to highlight that the issue isn't technical limitations of digital production. It's either an explicit creative choice to settle for visual blandness or the result of not making explicit choices leading to an ambient default sameness.

      > We decided that there's a wider range of ways movies can look, and we're intentionally taking advantage of that for creative freedom.

      That's what I find delightful about today's best work. And I'm fine respecting different creative choices, as long as someone actually thought about it and made those choices intentionally because they believed it was the best realization of their unique vision. But it's also true that the range of looks in today's content isn't as wide as it should be. There are still too many productions that suffer from that default blandness due to a lack of creative intention. I just refuse to believe so many DPs really woke up passionately committed to that particular orange/teal palette as the ideal expressive vision for their current project. Or the recent epidemic of 'HDR-flat' desaturation. We can and should strive to do better - to think and create different and deploy the full palette of expression we're so fortunate to finally have at our fingertips. I want to see and celebrate a broad range of expressively unique, creatively opinionated looks - even ones I don't personally care for - whether created in-camera, in grading or even purely in CGI.

      I should also add that there's still an element of technical limitation driving some of this default to visual conservatism. Sadly, inconsistent (and sometimes just broken) HDR implementations across consumer viewing platforms are a frustrating issue, and I sympathize with colorists and mastering engineers prepping content for literally 200 different distribution formats. While technical in origin, these issues are all the more tragic because there's no underlying reason it had to be such a shit show of uneven implementation. HDR, wide color gamuts and deep color spaces are all well-specified and purely in the digital domain. High-quality digital processing and conversion is inexpensive and built into even cheap HDMI encoder chips, so even the cheapest consumer displays with limited capabilities should be able to map content created with higher color spaces and wider dynamic ranges so that it still broadly represents the creator's intent. Yet too many still fail to properly handle mapping HDR and WCG content.
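      To illustrate what graceful mapping means in practice: instead of hard-clipping everything above the display's peak brightness, a sane fallback passes shadows and midtones through and rolls highlights off softly. A toy soft-knee curve in Python (illustrative only; real pipelines use standardized transfer functions such as the BT.2390 EETF, not this):

```python
def hdr_to_sdr_nits(nits: float, sdr_peak: float = 100.0, knee: float = 70.0) -> float:
    """Toy HDR->SDR luminance mapping with a soft knee.
    Below `knee` nits, pass values through unchanged; above it, compress
    everything into the remaining headroom so nothing hard-clips.
    (Illustrative curve, not any standard or vendor implementation.)"""
    if nits <= knee:
        return nits
    headroom = sdr_peak - knee
    x = (nits - knee) / headroom
    return knee + headroom * x / (1.0 + x)  # asymptotically approaches sdr_peak

# Midtones survive untouched, and a 1000-nit highlight lands just under
# 100 nits instead of blowing out or going grey.
print(hdr_to_sdr_nits(50.0))     # 50.0
print(hdr_to_sdr_nits(1000.0))   # ~99.06
```

      Even a curve this naive keeps a bright HDR scene watchable on a 100-nit panel, which is why the uneven implementations are so frustrating.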


    • It's hard to take the linked video seriously when its own host segments are poorly lit and color graded in that awful blue/orange film cliche.