Comment by eikenberry
20 hours ago
AMD. The final holdout, HDMI 2.1 support being blocked by the HDMI Forum, has been overcome: the Forum relented, and support is now landing in the kernel (expected in 7.2).
https://www.gamingonlinux.com/2026/05/further-expanded-amd-h...
I sort of figured that HDMI stupidity was strategically a good thing, as it brought the contrast between the HDMI consortium and VESA, specifically how they treat end users, more into the public eye.
That is, more people being subtly pushed toward DisplayPort is not a bad thing.
I was faintly surprised that my recent monitor purchase came with a DisplayPort cable.
It didn't help connecting it to my MacBook, but still..
DisplayPort has been driving the best high-end PC monitors for a long while. HDMI, OTOH, has lived in A/V land (DRM management).
Don't most monitors ship with DisplayPort cables? All of mine have. HDMI is more popular with TVs/home theater systems.
I didn't follow this story much: how exactly did they get past the legal hurdles? Or there never actually were any hurdles, just sabre rattling?
Purely rumor, but supposedly Valve put a ton of pressure on them (no idea by what means) because they wanted support for the Steam Machine release.
Any reason why we are using HDMI over DisplayPort?
Unless you're on the absolute newest stuff with DisplayPort 2.1, HDMI 2.1 has more bandwidth than DP 1.4. That covers Nvidia's 2000 through 4000 series: no DisplayPort 2.1 until the RTX 5000s.
Monitors released during that time generally did the same.
Also, if you want to use it through a capture card, HDMI ones are way more common and cheaper.
AMD Radeon 7000 and 9000 series all support DisplayPort 2.1
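To put rough numbers on the bandwidth comparison above, here is a back-of-the-envelope sketch (not from the thread). The link rates are published effective data rates after line-encoding overhead, rounded, and blanking intervals are ignored, so treat the results as approximate:

```python
# Back-of-the-envelope comparison of uncompressed video bandwidth vs. link
# capacity. Approximate: blanking intervals are ignored and the effective
# (post-encoding) link rates are rounded published values.

# Effective data rates in Gbit/s after line-encoding overhead (approximate).
links = {
    "HDMI 2.0 (TMDS)": 14.4,    # 18 Gbit/s raw
    "DP 1.4 (HBR3)": 25.92,     # 32.4 Gbit/s raw, 8b/10b
    "HDMI 2.1 (FRL)": 42.67,    # 48 Gbit/s raw, 16b/18b
    "DP 2.1 (UHBR20)": 77.37,   # 80 Gbit/s raw, 128b/132b
}

def uncompressed_gbps(width, height, hz, bits_per_channel=10):
    """Raw RGB pixel data rate in Gbit/s, ignoring blanking."""
    return width * height * hz * bits_per_channel * 3 / 1e9

for mode in [(3840, 2160, 120), (3840, 2160, 240)]:
    need = uncompressed_gbps(*mode)
    fits = [name for name, cap in links.items() if cap >= need]
    print(f"{mode[0]}x{mode[1]}@{mode[2]} 10-bit needs ~{need:.1f} Gbit/s; "
          f"fits uncompressed on: {fits or 'none (needs DSC)'}")
```

So 4K 120 Hz 10-bit already slightly exceeds DP 1.4 but fits HDMI 2.1, while 4K 240 Hz needs either DP 2.1 or Display Stream Compression on the older links.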
The vast majority of TVs only come with HDMI; they don't even have decent analog inputs anymore.
I have been told (but haven't confirmed) that this is mandated by the HDMI mob: if you want HDMI on your TV, it cannot also have DP.
What really drives me nuts is smart TVs with 100 Mbps Ethernet ports. When I bought a TV we looked in vain for gigabit Ethernet.
Some people have TVs or displays that only use HDMI. I personally wouldn't recommend HDMI if DisplayPort is available, but if HDMI is your only option, then having it work properly will be important.
My monitor has 1 displayport and 2 hdmi and I have 2 computers I use with it. They can't share the displayport. All comparable monitors (last time I checked) have the same. So it'd be nice if both worked.
For one, DisplayPort doesn’t support HDR output
That can't be right. I'm reading this comment on an HDR monitor over DP right now.
Don't all USB-C video outputs use DP alt mode too, with an HDMI adapter at the end? And they can do HDR.
The cable length limitations are also a pain in the ass for not-uncommon A/V system configurations. 6' recommended max, and the best you might get working stably if the device and cable gods smile on you is 15'. 6' is the lower edge of acceptable for just about any A/V system setup (in practice it means your devices need to be within about a meter of the screen's port[s], which is pretty close) and even 15' is still too short to be useful for, say, a projector, or a "the A/V receiver or HDMI switch is over in that cabinet, the TV is on this wall across the room" situation.
HDMI goes 25'+, no problem.
DisplayPort has supported HDR10 since 2016,
and DisplayPort 2.0, since 2019, has supported all the same variations (HDR10+, Dolby Vision) that HDMI does.
Do you mean in practice, or something? DP definitely supports HDR, and it seems to work fine for me.
Confidently incorrect.
My main monitor is 4K 240 Hz HDR and it works great over a DisplayPort cable, especially the HDR.
This seems wrong to me? I use it to do so every day.
If true, not supporting HDR is a feature