Comment by kderbe
13 hours ago
Steve Burke from GamersNexus tested eight games from their benchmark suite on Linux last month. Although his conclusion was generally positive, there were problems with nearly every game:
- F1 2024 didn't load due to anti-cheat
- Dragon's Dogma 2 and Resident Evil 4 had non-functional raytracing
- Cyberpunk 2077 with raytracing on consistently crashes when reloading a save game
- Dying Light 2 occasionally freezes for a whole minute
- Starfield takes 25 minutes to compile shaders on first run, and framerates for Nvidia are halved compared to Windows
- Black Myth: Wukong judders badly on Nvidia cards
- Baldur's Gate 3 Linux build is a slideshow on Nvidia cards, and the Windows build fails for some AMD cards
If you research these games in discussion forums, you can find configuration tweaks that might fix the issues. ProtonDB's rating is not a perfect indicator (Black Myth: Wukong is rated "Platinum").
And while Steve says measurements from Linux and Windows are not directly comparable, I did so anyway and saw that Linux suffers a 10-30% drop in average FPS across the board when compared to Windows, depending on the game and video card.
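For anyone who wants to redo that math, the per-game drop is just the standard percentage formula; a minimal sketch with placeholder numbers rather than the actual measurements:

```python
# Minimal sketch of the per-game comparison; the FPS values below are
# placeholders, not the GamersNexus measurements.
avg_fps = {
    # game: (windows_avg_fps, linux_avg_fps) -- illustrative only
    "Game A": (120.0, 105.0),
    "Game B": (90.0, 65.0),
}

for game, (win, lin) in avg_fps.items():
    drop_pct = (win - lin) / win * 100
    print(f"{game}: {drop_pct:.0f}% lower average FPS on Linux")
```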
AFAIK a lot of this comes down to NVIDIA not putting enough effort into its Linux drivers. There's a pretty well documented and well understood reason for the performance hit NVIDIA GPUs take on Linux.
Honestly, considering where we came from, a 10-30% perf drop is good and a reasonable tradeoff to consider. Especially for all the people who don't want to touch Windows 11 with an 11-foot pole (myself included), it's a more than decent path. I can reboot into my unsupported Win10 install if I really need the frames.
Really, Linux benchmarks need to be split between AMD and NVIDIA. Both are useful, as the "just buy an amd card lol" crowd is ignoring the actually large NVIDIA install base, and it's not like I'm gonna swap out my RTX 3090 to go Linux.
Thanks for the comparison! Would you have an apples to apples, or rather an NVIDIA to NVIDIA comparison instead of "across the board"? I'd suspect the numbers are worse for the pure NVIDIA comparison, for what I mentioned above.
>a 10-30% perf drop is good and is a reasonable tradeoff to consider
You are either trolling or completely out of your mind. You simply cannot be serious when saying stuff like this.
I'm not. The situation is improving rapidly, and I'd expect the gap to close soon.
I still have the Windows install. And with an RTX 3090, framerate is not that much of a consideration for most games, especially since my main monitor is "only" 1440p, albeit a 144Hz one.
Couple that with GSync, and framerate fluctuations aren't really noticeable. Gone are the days when dipping below 60Hz was a no-no. The most important metrics are stutter and 1% lows; those are what really affect the feel of a game. My TV is 120Hz with GSync too, and couch games with a controller are much less sensitive to framerate.
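For anyone unfamiliar with the metric, here's roughly how 1% lows are derived from captured frame times; a minimal sketch (conventions differ slightly between capture tools; this one averages the slowest 1% of frames):

```python
# Rough sketch: 1% low FPS from a list of frame times in milliseconds.
# Conventions vary between tools; this averages the slowest 1% of frames.
def one_percent_low_fps(frame_times_ms: list[float]) -> float:
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    count = max(1, len(worst) // 100)              # worst 1% of samples
    avg_worst_ms = sum(worst[:count]) / count
    return 1000.0 / avg_worst_ms                   # frame time -> FPS

# Illustrative numbers only: a mostly smooth 16.7 ms stream with a few 40 ms stutters.
frames = [16.7] * 990 + [40.0] * 10
print(f"average FPS: {1000 * len(frames) / sum(frames):.0f}")   # ~59
print(f"1% low FPS:  {one_percent_low_fps(frames):.0f}")        # ~25
```

The gap between those two numbers is exactly the stutter you feel even when the average looks fine.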
Do I leave performance on the table? Surely. Do I care? In the short term, no. The last GPU-intensive games I played were Hogwarts Legacy and Satisfactory, both of which can take a hit (Satisfactory doesn't max out the GPU, and Hogwarts can tolerate DLSS). The next intensive game I plan on playing is GTA VI, and by that time I'd fully expect the perf gap to have closed, and the game to play fine, given how much care Rockstar puts into the performance of their games, more so with the Gabe Cube being an actual target.
In the long run, I agree this is not a "happy" compromise. I paid for that hardware dammit. But the NVIDIA situation will be solved by the time I buy a new GPU: either they completely drop out of the gaming business to focus on AI, or they fix their shit because Linux will be an actual gaming market and they can't keep giving the finger to the penguin.
It’s reasonable to consider. If a title runs at 80FPS on Windows, it’ll be completely playable on Linux. Framerate isn’t everything.
It's perfectly reasonable. I actually run my Nvidia card at a 30% underclock so it works out fine for me on Linux.
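(For reference, one way to get that kind of underclock on Linux, assuming a recent proprietary driver on a GPU where `nvidia-smi`'s clock-locking is available; the boost value below is a placeholder, not my actual setting:)

```python
# Minimal sketch: cap the GPU's max clock ~30% below its stock boost clock using
# nvidia-smi's clock-locking feature (needs root and a GPU/driver that supports it).
import subprocess

STOCK_BOOST_MHZ = 1700                        # placeholder; look up your own card's boost clock
UNDERCLOCK = 0.30                             # the ~30% reduction mentioned above
max_clock = int(STOCK_BOOST_MHZ * (1 - UNDERCLOCK))

# Lock GPU clocks to the range 0..max_clock MHz (undo with `nvidia-smi -rgc`).
subprocess.run(["nvidia-smi", "-lgc", f"0,{max_clock}"], check=True)
print(f"GPU clocks capped at {max_clock} MHz")
```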
To each their own, but Windows 11 runs flawlessly on my machine with high-end specs and a 240 Hz monitor.
The Start menu works great with no lag, even immediately after booting.
The only thing I find annoying is the 'Setup' screens that sometimes show up after bigger updates.
---
Would I trade it all to get on Bazzite DX:
- lower game compatibility and potential bugs
- subpar NVIDIA drivers with the risk of performance degradation
- development restricted to dev containers relying on VS Code Remote
- loss of the Backblaze Unlimited plan
+ system rollbacks if an update fails
---
That does not seem worth it to me.
The Start menu worked 30 years ago on 32 MB of RAM and a box of scraps.
> The Start menu works great with no lag, even immediately after booting.
The very fact that this has to be explicitly mentioned is laughable.
Even $100 Chinese phones can achieve the same; this is the bare minimum for a modern system capable of driving a 240Hz monitor (I assume it can do so with most games).
Considering I found the Win10 start menu too slow, the W11 one doesn't stand a chance. But your comment gives me hope; it shows that W11 is not the complete shitshow people make it out to be, though the few times I used it on relatives' computers I found it not responsive enough.
I'm testing daily-driving Linux on my main rig (high-end from a few years ago, 5900X + 3090), and honestly I'm rediscovering my computer. A combination of less fluff, fewer animations, better filesystem performance (NTFS on NVMe is suboptimal), etc. I was getting fed up with a few Windows quirks: weird updates breaking stuff, weird audio issues (e.g. the audio subsystem picking up ~10s of latency on any interaction, like playing a new media file or opening the output switcher), weird display issues (the computer locking up when powering my 4K TV on or off), and whatnot. I'm still keeping the W10 install around, as having an unsupported OS is less of a problem for the occasional game, especially since I mostly play offline games.
As for the dev env, you're not limited to Bazzite; I run Arch. Well, I've been running it for two weeks on the rig. But you really do get the best devex with Linux.
> Baldur's Gate 3 Linux build is a slideshow on Nvidia cards
I played Baldur's Gate 3 on Linux on a GeForce GTX 1060 (which is almost 10 years old!) whose fan, I later found out, was broken, and I generally did not have issues (a couple of times over the whole game it slowed down for a couple of seconds, but nothing major).
The key word was Linux build. There's now an official Linux version so that BG3 runs better on Steam Deck. Everyone else should keep using Proton to run it like they've done this far.
Which applies to basically all games. Nowadays I make sure to select Proton before even running a game for the first time, in case it has a Linux build -- that will invariably be the buggier experience, so I want to avoid it.
That's the whole problem: no consistency. Some configurations work, others don't, even though they should be way more capable.
That's not even limited to Linux or gaming. A few weeks ago I tried to apply the latest Windows update to my 2018 Lenovo ThinkPad. It complained about insufficient space (I had 20GB free). I then used a USB drive as swap (required by Windows) and tried to install the update. Gave up after an hour without progress...
Hardware+OS really seems unfixable in some cases. I'm 100% getting a MacBook next time. At least with Apple I can schedule a support appointment.
For gaming, macOS does not seem like a great choice. I have friends on macOS and, at least on Steam, very few games run on that platform.
Additionally, when I was using macOS for work, I also ran into unexpected issues whenever I wanted to do anything a bit more special (think packages installed via Homebrew, compiling something from source, etc.).
So for me the options are: either use a locked-down device where you can't do anything other than what the designers thought of, and if you're lucky it will be good, OR use something where you have complete freedom and take on the responsibility to tweak when things don't work. macOS tries to be the first option (but in my opinion doesn't succeed as much as it claims to), Linux is the second option (but it's harder than it could be in many cases), and Windows tries to do both (and is worse than either alternative).
It's a CPU-bound game.
> Baldur's Gate 3 Linux build is a slideshow on Nvidia cards
Not at all my experience, which makes me question the rest. Also, per https://www.protondb.com/app/1086940, most people seem quite happy with it, so it's not a "me" problem.
Finally, the "10-30% drop in average FPS across the board" might be correct, but so what? I understand a LOT of gamers want "the best" performance for the money they paid, but pretty much NO game becomes less fun with even a 30% FPS drop; you just adjust the settings and go play. I think a lot of gamers get confused and treat maximizing performance as a game in itself. It might be fun, and that's 100% OK, but it's also NOT what playing an actual game is about.
Those are mostly reports for the Windows build of Baldur's Gate 3, running through Proton/Wine. He's talking about the newer Linux native build of the game from 3 months ago.
There's a few reports there for the native version of the game: https://www.protondb.com/app/1086940#9GT638Fuyx , with similar Nvidia GPU issues and a fix.
> pretty much NO game becomes less fun with even a 30% FPS drop
I mostly play fighting games. A 7% drop in FPS is more than enough to break the whole game experience, as combos rely on frame data. For example, Street Fighter 6 is locked at 60 fps. A low punch needs 4 frames to come out and leaves a 4-frame window to land another hit. With a 7% drop in FPS, you would miss your combo. Even the tiniest drop in FPS makes the game unplayable.
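To put rough numbers on that (taking the 60 fps lock and the 4-frame window above at face value; an illustration, not Street Fighter 6's actual frame data):

```python
# Back-of-the-envelope: how often a 4-frame combo window gets disturbed when
# 7% of frames are not delivered on time.
LOCKED_FPS = 60
frame_ms = 1000 / LOCKED_FPS                  # ~16.7 ms per game-logic frame
window_ms = 4 * frame_ms                      # ~66.7 ms to land the follow-up hit

displayed_fps = LOCKED_FPS * (1 - 0.07)       # 7% drop: ~55.8 frames shown per second
dropped_per_sec = LOCKED_FPS - displayed_fps  # ~4.2 logic frames per second never shown

# If drops are spread evenly, the expected number of skipped frames inside any
# given 4-frame window is small but far from negligible:
windows_per_sec = LOCKED_FPS / 4
hit_rate = dropped_per_sec / windows_per_sec  # ~0.28, i.e. roughly one window in four
print(f"window: {window_ms:.1f} ms, ~{hit_rate:.0%} of windows affected by a skipped frame")
# A skipped frame shifts the visual cue you time your press against by a full
# ~16.7 ms, which is exactly the kind of inconsistency that drops combos.
```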
It's the same for almost every fighting game. I know it's a niche genre, but I'm quite sure the same holds for other genres. It's a complete dealbreaker for competitive play.
I played competitive Quake on LAN and online. If your setup, hardware or software, can't handle your configuration, you either get a better one (spend money, roll back your OS, etc.) or adjust it (lower your settings; nobody plays competitive games for the aesthetics, and Quake in that context is damn ugly and nobody cares).
It's not about a drop in-game, it's about being prepared for the game. If you get a 7% drop, or even a 0.1% drop (whatever is noticeable to you), then you adjust.
To be clear, I'm not saying worse performance is OK; I'm saying everybody wants 500 FPS from $1 hardware but nobody gets that. Consequently we get a compromise, e.g. pay $2000 for 60 FPS and so be it. If you have to pay $2000 + $600, or lower the graphics settings, to still get 60 FPS, that's what you do.
PS: FWIW, competitive gaming is a niche within gaming. Most people might want to compete, but in practice most are not, at least not professionally. It's still an important use case, but it's not the majority. Also, from my own personal experience, I didn't see a performance drop.
> It's a complete dealbreaker for competitive play
Very true, and this is the biggest issue for me when it comes to gaming on Linux. And it's not just the raw FPS count; you can usually brute-force your way around that with better hardware. (I'm guessing you could probably get a locked 60 in Street Fighter 6 even with a 30% performance loss?) It's things like input lag and stutter, which in my experience are almost impossible to resolve.
If it weren't for competitive shooters, I could probably go all Linux. But for now I still need to switch over to Windows for that.
You're talking about the Proton version, parent was talking about the Linux native build that is optimized for Steam Deck.