Comment by 0rdinal
3 months ago
I daily drive Ubuntu; the user experience is comparable to (and in many cases better than) Windows 11. The only sticking point for me is display drivers. HDR on Wayland is barely functional (in my experience), and getting things like hardware-accelerated AV1 encoding, full Vulkan API support, etc. to work has been extremely difficult. Every time I log in using a Wayland desktop, only my main monitor is detected and it defaults to 60 Hz. I have to go through a whole process of unplugging the "undetected" monitors and plugging them back in. X11 doesn't suffer from this, but of course it does not support HDR.
Yes, this is almost entirely Nvidia's fault, and yes, I should know better than to use NV graphics cards on Linux distros; but frankly, the barrier to entry should not be having to replace an expensive piece of hardware to achieve feature parity. (Obligatory "Nvidia, f*k you!")
> Every time I log in using a Wayland desktop, only my main monitor is detected and it defaults to 60 Hz. I have to go through a whole process of unplugging the "undetected" monitors and plugging them back in.
Are you using GNOME? mutter has this problem where it does not retry commit on the next CRTC: https://gitlab.gnome.org/GNOME/mutter/-/issues/3833. If this is actually what's happening on your system, switching to KDE should solve it.
> HDR on Wayland is barely functional (in my experience)
This also sounds specific to GNOME, as mutter still doesn't have color management. You'll get a better HDR experience with KDE.
GNOME is typically the worst of all the options if you need feature support. They aggressively NACK Wayland protocol proposals and then don't implement them, while almost the entire rest of the ecosystem does.
Seriously, it's bizarre to me how aggressively they pushed for Wayland, only to then hold it back like that.
> This also sounds specific to GNOME, as mutter still doesn't have color management.
GNOME 49 should've solved that. [0]
[0] https://gitlab.gnome.org/GNOME/mutter/-/merge_requests/4102
I don't think so. I'm on GNOME 49 and nothing has changed compared to 48.
GNOME has both the color-management and color-representation protocols implemented. HDR works fine on it.
No, having the bare minimum "HDR support" does not mean it works fine. I have a 27-inch 4K 144 Hz monitor with a P3 wide color gamut and HDR600. The monitor is connected to two PCs: one running Arch Linux with GNOME as the DE, and one running Windows 11.
Since Windows 11 24H2, with the new color management feature turned on, I get correct colors on that monitor in both SDR and HDR modes. So I end up leaving HDR on at all times, and mpv can play HDR videos with no color or brightness issues.
GNOME, on the other hand, is stuck with sRGB output in SDR mode, so on a wide-gamut panel you get oversaturated colors. With HDR on, SDR content is no longer oversaturated, but if you play HDR videos with mpv, the image looks darkened and wrong. I've tried setting target-peak and target-contrast to match the values Windows auto-detects, but the video still looks off.
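For reference, the kind of mpv config I've been experimenting with looks roughly like this (treat the numbers as placeholders; 600 nits is just what the HDR600 rating implies, and the right values are monitor-specific):

    # mpv.conf - rough sketch of what I've been trying
    vo=gpu-next
    target-colorspace-hint=yes   # let mpv signal the video's HDR metadata to the compositor
    target-peak=600              # placeholder: roughly the panel's HDR600 peak, in nits
    target-contrast=auto

Even with variations of this, the image on GNOME still looks off compared to the same file on the Windows box.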
In my experience, hardware support with drivers is far better on Ubuntu than on any of the 'consumer operating systems'. Display drivers, Nvidia in particular, have been a problem though, which I avoid by just going for integrated graphics (Intel). That worked well since I don't play games; but then I got into Blender, which really needs a proper GPU (with working drivers).
This summer I tried to interest a relative in using a Wacom tablet on their Apple computer. In Linux-world you just plug the thing in and the job is done. Yet on the Apple computer I was having to hunt down drivers and install stuff, taking me out of my comfort zone. We didn't get the Wacom tablet to work (it is a decade old) and gave up.
All operating systems will inevitably force their ways of working on you to some extent, and it is 'better the devil you know' for most people, myself included. My first OS that 'didn't get in the way' of what I wanted to do was SGI Irix. I think Ubuntu has that aspect of not getting in the way, though admittedly I am comfortable using the command line to type in installation instructions. Text instructions for installing stuff are brilliant, since you can reproduce results consistently with not much more than 'cut and paste'. As soon as you move to a consumer OS, this becomes murkier, particularly if you have to use things like 'Homebrew'. An Apple user will quibble with me over whether that is really difficult, but each to their own.
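For example, on Ubuntu the simplest route to getting Blender installed is just:

    sudo apt update
    sudo apt install blender   # simplest route; snap and upstream builds also exist

Two lines anyone can paste, and the result is the same every time.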
Along the way I have invariably kept the standard Windows installation, only to never use it, ever. I thought I would need dual boot to hop into Photoshop, Word, or some other Windows application; however, this has proven not to be the case.
The time has come for me to delete those Windows partitions and get my disk space back. In so doing I will also be excluding myself from any of those AI integrations that must be polluting Windows these days.
All of my problems were solved by disabling hybrid graphics and using the dedicated card only. I haven't had a single bug since then on X11 (I haven't tried Wayland yet, because it was almost completely unusable with the hybrid config). The only drawback is battery life, but that wasn't great even before: I could never reach the ~4 hours that was possible with Windows, even with the dedicated card disabled. So I'm not entirely sure that it's entirely on Nvidia.
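(For anyone wanting to do the same: how you disable hybrid graphics depends on the machine. On some laptops it's a dGPU-only/MUX toggle in the firmware; on Ubuntu-family distros the nvidia-prime tool can also force the dedicated card, roughly:

    sudo prime-select nvidia   # assumes Ubuntu's nvidia-prime tooling is installed; reboot afterwards

Your setup may differ.)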
Same, on my laptop. Hybrid graphics destabilised both Debian (with the Nvidia drivers installed) and the Windows 11 installation I keep on there for SharpCap. Switching to Nvidia-GPU-only mode made everything rock solid.
This was my first experience with hybrid graphics, and so far I'm not impressed.
Hybrid graphics gave me trouble last spring, but in my case it was fixed around late July. I still launch Steam with weird env variables (I don't often change my shortcuts), but I'm not sure it's still needed.
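They're probably just the usual PRIME render offload variables, i.e. something like this as the Steam launch options (illustrative; I'd have to double-check my actual shortcut):

    # the standard NVIDIA PRIME render offload variables
    __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia %command%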
Are you using a ThinkPad? My work laptop has this issue too, on Windows 11. 75% of the time I have to unplug the monitor after waking up the laptop, 20% of the time it works, and 5% of the time it comes up at 640x480 and I have to unplug it again.
HDR is unusable on Windows too. I finally decided to sell my HDR monitor after like a year because it was a massive pain in the ass from the moment I bought it. One of my biggest wastes of money ever.