Comment by MegaDeKay

20 hours ago

I'd say there are two remaining roadblocks. First and biggest is kernel-level anti-cheat frameworks, as you point out. But there's also no open-source HDMI 2.1 implementation allowed by the HDMI cartel, so people like me with an AMD card max out at 4K60 even for open-source games like Visual Pinball (unless you count an adapter with hacked firmware between the card and the display). Nvidia and Intel get away with it because they implement the functionality in their closed-source blobs.

This is kind of a niche problem. It only affects people with AMD GPUs pushing more than 4K60 over HDMI. Get an Nvidia card, stay at 60 Hz, drop the resolution, or use DisplayPort and you will be fine.

It is not really a roadblock, more like a bump, and it is far from the only bump. Some games just don't run on Linux, or run quite terribly, and don't have a big enough community for people to care. Sometimes one of your pieces of hardware, maybe an exotic controller, doesn't like Linux. Sometimes it is not the fault of the game at all: you want to do something else with that PC that isn't supported on Linux, and you don't want to dual boot. Overall, you will have fewer problems gaming on Windows, especially if you don't enjoy a trip to Stack Overflow and the command line, but except maybe for anti-cheat, there are no "big" reasons, just a lot of small ones.

And sure, it is improving.

This is the first I've heard of this, since I personally have no need of anything over 4K@60 (that already borders on absurd in my mind). I'm curious whether this is something that's likely to get reverse-engineered by the community at large?

Outrageous that a ubiquitous connection protocol is allowed to be encumbered in this way.

  • For the particular use case I mentioned in my earlier post (Visual Pinball), 4K@120 is actually a pretty big deal. We often play on screens 42" and up, so the 4K detail is put to good use and makes things like the instruction cards in the corners legible. But the bigger difference is the smoothness in gameplay that 120 Hz gets you. The ball travels really fast, so 120 Hz helps gameplay a lot while reducing lag at the same time. And because a large chunk of the playfield is static at any one time, you don't need something like a 5090 to hit 120 Hz at that resolution the way you might with a triple-A shooter.
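    To put a rough number on "the ball travels really fast" (illustrative figures of mine, not from the post; the 5 m/s ball speed is an assumption):

    ```python
    # How far a fast-moving pinball jumps between rendered frames,
    # and how long each scanout interval is, at 60 Hz vs. 120 Hz.
    ball_speed_mm_s = 5000  # assumed speed for a hard shot
    for hz in (60, 120):
        print(f"{hz} Hz: ~{ball_speed_mm_s / hz:.0f} mm between frames, "
              f"{1000 / hz:.1f} ms scanout interval")
    # 60 Hz: ~83 mm jumps at 16.7 ms; 120 Hz: ~42 mm at 8.3 ms.
    ```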

Is HDMI really a roadblock to gaming when DisplayPort exists?

  • It's a blocker if you want to use a TV; there are almost no TVs with DP. This HDMI licensing crap is also the reason a Steam Deck can't output more than 4K@60 over HDMI unless you install Windows on it.

  • Up until a year or two ago, the majority of monitors (and graphics cards) used DisplayPort 1.4 and HDMI 2.1, with HDMI 2.1 (~42 Gbps effective) having more bandwidth than DisplayPort 1.4 (~26 Gbps effective).

    This is my case with my relatively new/high-end RTX 4080 and OLED monitor. So until I upgrade both, I use HDMI to be able to drive a 1440p 240 Hz 10-bit HDR signal at ~30 Gbps.
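    A rough sanity check of those numbers (my own arithmetic, not from the thread; the ~15% blanking allowance is an assumption, real timings vary):

    ```python
    # Does 1440p @ 240 Hz, 10-bit RGB (4:4:4) fit on DP 1.4 vs. HDMI 2.1?

    def required_gbps(h, v, hz, bpc, blanking=0.15):
        """Uncompressed 4:4:4 bandwidth with a rough blanking allowance."""
        active = h * v * hz * bpc * 3          # three color channels
        return active * (1 + blanking) / 1e9

    need = required_gbps(2560, 1440, 240, 10)  # ~30.5 Gbps
    dp14 = 32.4 * 8 / 10                       # HBR3 after 8b/10b ≈ 25.9 Gbps
    hdmi21 = 48.0 * 16 / 18                    # FRL after 16b/18b ≈ 42.7 Gbps

    print(f"needed: {need:.1f} Gbps")
    print(f"fits DP 1.4 (no DSC): {need <= dp14}")    # False
    print(f"fits HDMI 2.1:        {need <= hdmi21}")  # True
    ```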

    • I had said I wouldn’t upgrade from my RTX 3080 until I could run “true 4K”.

      I finally got the 240 Hz 4K uncompressed, but it required buying a $1300 Asus OLED monitor and the RTX 5090. It looks amazing though, even with frame gen. Monster Hunter had some particularly breathtaking HDR scenes. I think it uses DisplayPort 2.1? Even finding the cable is difficult; Micro Center didn't have them in April, and the only one that worked was the one that came with the monitor.

  • I want to play games on the same fancy LG TV I use with my consoles. I just checked, and it does not appear to have DisplayPort.

Does AMD not support DisplayPort? I'm not an expert on this, but that sounds to me like the superior technology.

  • TVs don't support DisplayPort, so this makes Linux PCs like the Steam Machine inferior console replacements if you want high refresh rates. A lot of TVs now support 4K@120 Hz with VRR, and the PS5 and Xbox Series X support those modes too.

    (Some games support 120 fps, but it's also used to present a 40 Hz image in a 120 Hz container to improve input latency for games that can't hit 60 at high graphics quality.)
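    To make the pacing argument concrete, a tiny illustration (mine, not from the comment): 40 fps divides evenly into 120 Hz but not into 60 Hz, which is why the 120 Hz container exists at all.

    ```python
    # Frame pacing: a game frame rate paces evenly only when the display
    # refresh is an integer multiple of it. 40 fps in a 120 Hz container
    # holds each frame for exactly 3 scanouts; in a 60 Hz container it
    # would need an uneven 1-2-1-2 cadence (judder), so games fall back
    # to 30 fps there instead.
    for container_hz, game_fps in [(60, 30), (60, 40), (120, 40)]:
        scanouts = container_hz / game_fps
        frame_ms = 1000 / game_fps
        print(f"{game_fps} fps in {container_hz} Hz: {frame_ms:.1f} ms/frame, "
              f"{scanouts:g} scanouts/frame, even: {scanouts.is_integer()}")
    ```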

    • Why don't TVs support DisplayPort? If HDMI 2.1 support is limited, a TV with DisplayPort sounds like an obvious choice.

      I thought audio might be the reason, but as far as I can tell, DisplayPort supports that too.


    • Correction: you can get 4K@120 Hz with HDMI 2.0, but you won't get full 4:4:4 chroma; 4:2:0 will be forced instead.

      In my case I have an HTPC running Linux with a Radeon 6600 connected via HDMI to a 4K@120 Hz capable TV, and honestly, at that sitting distance/TV size and using 2x DPI scaling you just can't tell any chroma subsampling is happening. It is of course a ginormous problem in a desktop setting, and even worse if you try using 1x DPI scaling.

      What you will lose, however, are the newer forms of VRR, and the signal may be unstable with lots of dropouts.
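      The arithmetic behind that correction, as a sketch (my numbers, not the commenter's; the ~15% blanking allowance is an assumption):

      ```python
      # Why HDMI 2.0 forces 4:2:0 at 4K @ 120 Hz: its TMDS payload is
      # ~14.4 Gbps (18 Gbps raw, 8b/10b encoded). 4:2:0 keeps luma at
      # full resolution but samples each chroma channel at 1/4 rate,
      # halving the bits per pixel vs. 4:4:4.
      HDMI20_GBPS = 18.0 * 8 / 10                # ≈ 14.4

      def gbps(w, h, hz, bpc, chroma, blanking=0.15):
          frac = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}[chroma]
          bpp = bpc * (1 + 2 * frac)             # luma + two chroma channels
          return w * h * hz * bpp * (1 + blanking) / 1e9

      for chroma in ("4:4:4", "4:2:0"):
          need = gbps(3840, 2160, 120, 8, chroma)
          print(f"4K@120 8-bit {chroma}: {need:.1f} Gbps, "
                f"fits HDMI 2.0: {need <= HDMI20_GBPS}")
      # 4:4:4 needs ~27.5 Gbps (doesn't fit); 4:2:0 needs ~13.7 (fits).
      ```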

I don't understand why they can't support AMDPort 2.1, which coincidentally has the same connector and protocol.