
Comment by episode404

7 hours ago

>a 10-30% perf drop is good and is a reasonable tradeoff to consider

>You are either trolling or completely out of your mind. You simply cannot be serious when saying stuff like this.

I'm not. The situation is improving rapidly, and I'd expect the gap to close soon.

I still have the Windows install. And with an RTX 3090, framerate isn't that much of a consideration for most games, especially since my main monitor is "only" 1440p, albeit a 144Hz one.

Couple that with G-Sync, and framerate fluctuations aren't really noticeable. Gone are the days when dipping below 60 FPS was a no-no. The most important metrics are stutter and 1% lows; those really affect how a game feels. My TV is 120Hz with G-Sync too, and couch games with a controller are much less sensitive to framerate.

Do I leave performance on the table? Sure. Do I care? In the short term, no. The last GPU-intensive games I played were Hogwarts Legacy and Satisfactory, both of which can take a hit (Satisfactory doesn't max out the GPU, and Hogwarts can fall back on DLSS). The next intensive game I plan on playing is GTA VI, and by then I'd fully expect the perf gap to have closed, and the game to play fine, given how much care Rockstar puts into the performance of their games, all the more so with the Gabe Cube being an actual target.

In the long run, I agree this is not a "happy" compromise. I paid for that hardware, dammit. But the NVIDIA situation will be solved by the time I buy a new GPU: either they drop out of the gaming business entirely to focus on AI, or they fix their shit, because Linux will be an actual gaming market and they can't keep giving the finger to the penguin.

It's reasonable to consider. If a title runs at 80 FPS on Windows, it'll still be completely playable on Linux. Framerate isn't everything.

It's perfectly reasonable. I actually run my Nvidia card at a 30% underclock, so it works out fine for me on Linux.