Comment by bitanarch

1 day ago

HDR still doesn't really work on Linux w/ nVidia GPUs.

1. 10bpp color depth is not supported on RGB monitors, which make up the majority of LCD displays on the market. Concretely, ARGB2101010 and XRGB2101010 modes are not supported by the current nVidia Linux drivers - the drivers only offer ABGR2101010 and XBGR2101010 (see: https://github.com/NVIDIA/open-gpu-kernel-modules/blob/main/...). A quick way to check what your own driver exposes is sketched after this list.

2. Common browsers like Chrome and Firefox have no real support for HDR video playback on nVidia Linux drivers. The "HDR" option appears on YouTube, but no HDR color can be displayed with an nVidia GPU.
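
If you want to verify this on your own machine, here's a quick sketch (using drm_info; the exact output layout varies between versions) that greps the KMS plane format lists for the 10-bit entries:

  # RGB-ordered 10-bit formats (what most LCD monitors need):
  drm_info | grep -cE '(XRGB|ARGB)2101010'

  # BGR-ordered 10-bit formats (what the nVidia driver currently offers):
  drm_info | grep -cE '(XBGR|ABGR)2101010'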

Also, video backgrounds in Google Meet on Chrome are broken with nVidia GPUs and Wayland. Ironically it works on Firefox. This has been broken for a few years and no fix is in sight.

The "HDR" toggle you get on Plasma or Mutter is hiding a ton of problems behind the scenes. If you only have 8bpp, even if you can find an app that somehow displays HDR colors on nVidia/Wayland - you'll see artifacts on color gradients.

I have Interstellar on 4K Ultra HD Blu-ray (it features HDR on the cover), a Sony 4K Blu-ray player (UBP-X700), and an LG G4 OLED television. I also have an AVR (Denon AVR-S760H 7.2 Ch) connecting both the Blu-ray player and a PC running Linux with an RTX 3060 12GB graphics card to the television. I've been meaning to compare HDR on Linux against the Blu-ray, and I guess now is better than never. I'll reply to my post after I'm done.

  • Try it with the different monitors you have. The current nVidia Linux drivers only have BGR output for 10bpp, which works on TVs and OLEDs but not on most LCD monitors.

    My monitors (InnoCN 27M2V and Cooler Master GP27U) require RGB input, which means they're limited to 8bpp even with HDR enabled on Wayland. There's another commenter below who uses a Dell monitor and manages to get BGR input working and full HDR on nVidia/Linux.

  • Television HDR mode is set to FILMMAKER, OLED brightness is 100%, and Energy Saving Mode is off. Connected to the AVR with an HDMI cable that says 8K.

      PC has Manjaro Linux with RTX 3060 12GB
    
      Graphics card driver: Nvidia 580.119.02
    
      KDE Plasma Version: 6.5.4
    
      KDE Frameworks Version: 6.21.0
    
      Qt Version: 6.10.1
    
      Kernel Version: 6.12.63-1-MANJARO
    
      Graphics Platform: Wayland
    

    Display Configuration

      High Dynamic Range: Enable HDR is checked
    
      There is a button for brightness calibration that I used for adjustment.
    
      Color accuracy: Prefer color accuracy
    
      sRGB color intensity: This seems to do nothing (even after applying). I've set it to 0%.
    
      Brightness: 100%
    

    TV is reporting HDR signal.

    AVR is reporting...

      Resolution: 4KA VRR
    
      HDR: HDR10
    
      Color Space: RGB / BT.2020
    
      Pixel Depth: 10 bits
    
      FRL Rate: 24 Gbps
    

    I compared the same Interstellar explosion scene: 19s into the YouTube video, played three different ways on Linux, versus 2:07:26 on the Blu-ray.

    For Firefox 146.0.1, by default there is no HDR option on YouTube, and the 4K video clearly doesn't have HDR. I enabled HDR in Firefox by going to about:config and setting the following to true: gfx.wayland.hdr, gfx.wayland.hdr.force-enabled, gfx.webrender.compositor.force-enabled. With those set, the colors look completely washed out.

    For Chromium 143.0.7499.169, HDR is enabled by default. This looks like HDR.

    I downloaded the HDR video from YouTube and played it using mpv v0.40.0-dirty with --vo=gpu-next --gpu-api=vulkan --gpu-context=waylandvk. Without these settings the video seems a little too bright, like the Chromium playback. This was the best playback of the three on Linux.
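
    For reference, the full invocation looks roughly like this (the file name is a placeholder; --target-colorspace-hint is a real mpv option but untested here):

      # placeholder file name for the downloaded HDR video
      mpv --vo=gpu-next --gpu-api=vulkan --gpu-context=waylandvk hdr-trailer.webm

      # --target-colorspace-hint asks mpv to target the video's colorspace for
      # HDR passthrough; may or may not change anything with this driver stack
      mpv --vo=gpu-next --gpu-api=vulkan --gpu-context=waylandvk \
          --target-colorspace-hint=yes hdr-trailer.webm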

    On the Blu-ray the HDR is Dolby Vision according to both the TV and the AVR. The AVR is reporting...

      Resolution: 4k24
    
      HDR: Dolby Vision
    
      Color Space: RGB
    
      Pixel Depth: 8 bits
    
      FRL Rate: no info
    

    ...I looked into this, and apparently Dolby Vision tunnels its high-bit-depth (12-bit) YCbCr 4:2:2 data inside what the HDMI link reports as 8-bit RGB, which would explain the readout above. The Blu-ray looks like it has the same brightness range, but the color of the explosion (2:07:26) seems richer compared to the best playback on Linux (19s).

    I would say the colors overall look better on the Blu-ray.

    I might be able to calibrate it better if the sRGB color setting worked in the display configuration. Also I think my brightness setting is too high compared to the Blu-ray. I'll play around with it more once the sRGB color setting is fixed.

    *Edit: Sorry, Hacker News has completely changed the formatting of my text.

    • I don't think the Interstellar Blu-ray has Dolby Vision (or Dolby Atmos), just regular HDR10. If the TV/AVR says it's Dolby Vision, something in your setup might be doing some kind of upconversion.

I find that running HDR games in a standalone gamescope session through Steam works great on my OLED TV. Not perfect, but great.
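
For anyone wanting to try this, a sketch of the usual setup (flag support varies across gamescope versions) is a per-game Steam launch option that wraps the game in gamescope with HDR enabled:

  # Steam > game Properties > Launch Options:
  gamescope -f --hdr-enabled -- %command%

  # for Proton/DXVK titles, exposing HDR to the game itself may also need:
  DXVK_HDR=1 gamescope -f --hdr-enabled -- %command%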

nvidia

  • Right, it IS nvidia's fault at this point, but it's still, what, 90% of the consumer GPU market.

    • Funny how it went from "just get an Nvidia card for Linux" and "oh my god, what did I do to deserve fglrx?" to "just get an AMD card" and "it's Nvidia, what did you expect?"

    • They're also selling $3000 nVidia AI workstations that exclusively run Linux. But what if you want to watch an HDR video on one? No. What if you want to use Google Meet on Chrome/Wayland? It's broken.

    • For aftermarket purchases, sure, but 95% of consumer machines use either Intel or AMD integrated graphics.

I don't think this is true. I can go into my display settings in KDE Plasma, enable HDR, and configure the brightness. I have an nVidia Blackwell card.

  • You can enable it, yes. But (assuming you're on an LCD display and not an OLED) you're likely still on XRGB8888, i.e. 8 bits per channel. Check `drm_info`.
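
    A rough way to check (assuming drm_info prints Plane entries with a Format: line for the currently attached framebuffer, as in the sibling comment):

      # XRGB8888 on the primary plane = still 8 bits per channel;
      # XRGB/ARGB/XBGR/ABGR2101010 = 10 bits per channel
      drm_info | grep -iE 'plane|format:'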

    Also, go to YouTube and play this video: https://www.youtube.com/watch?v=onVhbeY7nLM

    Do it once on "HDR" on Linux, and then on Windows. The "HDR" in nVidia/Linux is fake.

    The brightness you see on Plasma or Mutter is indeed related to the HDR support in the driver. But it's not really useful for the most common HDR tasks at the moment.

    • I asked claude to investigate:

        Your Display Configuration
      
        Both monitors are outputting 10-bit color using the ABGR2101010 pixel format.
      
        | Monitor                | Connector | Format      | Color Depth | HDR          | Colorspace |
        |------------------------|-----------|-------------|-------------|--------------|------------|
        | Dell U2725QE (XXXXXXX) | HDMI-A-1  | ABGR2101010 | 10-bit      | Enabled (PQ) | BT2020_RGB |
        | Dell U2725QE (XXXXXXX) | HDMI-A-2  | ABGR2101010 | 10-bit      | Disabled     | Default    |
      
      

      * Changed the serial numbers to XXXXXXX

      I am on Wayland and outputting via HDMI 2.1 if that helps.

      EDIT: Claude explained how it determined this with drm_info, and I manually verified it:

      > Planes 0 and 3 are the primary planes (type=1) for CRTCs 62 and 81 respectively - these are what actually display your desktop content. The Format: field shows the pixel format of the currently attached framebuffer.

      EDIT: Also note that I am slowbanned on this site, so I may not be able to respond for a bit.

      EDIT: You should try connecting with HDMI 2.1 (you will need an 8K HDMI cable or it will fall back to older standards instead of FRL).

      EDIT: HDR on YouTube appears to work for me. YouTube correctly identifies HDR on only one of my monitors, and I can see a big difference in the flames between them in this scene: https://www.youtube.com/watch?v=WjJWvAhNq34


    • It's not obvious how to interpret the output. I pasted it into ChatGPT, and it thinks I am using "Format: ABGR2101010" for both monitors (only one of them has HDR on), so I don't trust it.

      EDIT: See my sibling comment.
