Comment by jdright

1 day ago

yes, that is a very common, well-known practice: vendors applying specific optimizations for known titles.

It is also part of the benchmarks game they play against each other.

The link is long dead and the Wayback machine doesn’t have a copy.

But in 2001 ATI was caught applying optimizations to Quake 3 when someone realized if you renamed the executable from “quake” to “quack” the score dropped a ton. It was a big scandal.

I know that’s common now, but it wasn’t something that was done at the time.
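
A minimal sketch of how that kind of executable-name detection could work, assuming a hypothetical driver-side check. GetModuleFileNameA is the real Win32 call; the profile function and the matching logic are purely illustrative:

```c
/* Hypothetical sketch: a driver keying optimizations off the executable
 * name, which is why renaming the "quake" executable to "quack" would
 * break the boost. Only GetModuleFileNameA is a real API here. */
#include <windows.h>
#include <ctype.h>
#include <string.h>

static void apply_quake3_profile(void) {
    /* illustrative: swap shaders, drop texture precision, etc. */
}

void driver_init(void) {
    char path[MAX_PATH];
    GetModuleFileNameA(NULL, path, MAX_PATH); /* path of the host .exe */

    for (char *p = path; *p; ++p)             /* case-insensitive match */
        *p = (char)tolower((unsigned char)*p);

    if (strstr(path, "quake3") != NULL)
        apply_quake3_profile();               /* a "quack" exe gets nothing */
}
```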

  • Was it a scandal at the time? My understanding of how per-game card-driver optimizations work today is:

    1. AAAA Game Studio shits out another unoptimized clunker

    2. nvidia considers it a reputational risk if games run at 30 FPS on a 5090

    3. They go in, look at the perverse ways the game misuses rendering primitives, and then hack shit in to make whatever bad things it's doing less bad (see the sketch after this list).

    As a gamer, this seems fine to me, and I generally blame the AAAA devs for being bad at their jobs, or the AAAA studio leads for being OK with shipping unoptimized messes.
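
    To make that concrete, here is a rough, entirely hypothetical sketch of the shape such per-game fixes can take. Drivers are known to use app profiles and shader-hash-based substitution, but every name and value below is invented:

    ```c
    /* Hypothetical per-game workaround table. Real drivers do something of
     * this shape (app profiles, shader-hash substitution); all names and
     * values here are made up for illustration. */
    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    typedef struct {
        const char *exe_name;    /* which game the workaround targets */
        uint64_t    shader_hash; /* hash of the known-bad shader */
        const void *replacement; /* hand-optimized replacement blob */
    } app_workaround;

    static const app_workaround workarounds[] = {
        { "clunker4.exe", 0x9e3779b97f4a7c15ull, NULL /* patched blob */ },
    };

    /* Called when the game submits a shader: swap known-bad ones. */
    const void *maybe_replace_shader(const char *exe, uint64_t hash,
                                     const void *original) {
        for (size_t i = 0; i < sizeof workarounds / sizeof *workarounds; ++i)
            if (strcmp(workarounds[i].exe_name, exe) == 0 &&
                workarounds[i].shader_hash == hash &&
                workarounds[i].replacement != NULL)
                return workarounds[i].replacement;
        return original; /* no profile entry: run the game's own shader */
    }
    ```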

    • > As a gamer, this seems fine to me

      As a software developer, it almost certainly has a bad effect on the ecosystem long term. "Hack shit in" is the very definition of technical debt, and that has a cost that someone, somewhere is going to have to pay in some form.

      13 replies →

    • A friend of mine developed his own game engine, and what he said is that you need to bargain with the nVidia driver: the hardware doesn't perform at its peak when you write everything honoring the spec, and the driver feels free to ignore your commands about how you want to do some things (e.g. memory transfers).

      Like the board manufacturers, game developers also need to please the drivers and do things the way the driver silently dictates (regardless of what DirectX, OpenGL, or Vulkan says); otherwise all bets are off.
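
      That bargaining is visible even in the standard Vulkan memory API: the application doesn't command where memory lives, it queries what the driver offers and takes the closest match. A minimal sketch, where the fallback policy is the only real choice the app has (e.g. ask for DEVICE_LOCAL | HOST_VISIBLE and settle for plain HOST_VISIBLE staging if that combination isn't offered):

      ```c
      #include <vulkan/vulkan.h>

      /* Find a memory type permitted by type_bits carrying all `wanted`
       * flags, falling back to `fallback` flags if the driver doesn't
       * offer the preferred combination. */
      uint32_t pick_memory_type(VkPhysicalDevice gpu, uint32_t type_bits,
                                VkMemoryPropertyFlags wanted,
                                VkMemoryPropertyFlags fallback) {
          VkPhysicalDeviceMemoryProperties props;
          vkGetPhysicalDeviceMemoryProperties(gpu, &props);

          for (int pass = 0; pass < 2; ++pass) {
              VkMemoryPropertyFlags need = (pass == 0) ? wanted : fallback;
              for (uint32_t i = 0; i < props.memoryTypeCount; ++i)
                  if ((type_bits & (1u << i)) &&
                      (props.memoryTypes[i].propertyFlags & need) == need)
                      return i;
          }
          return UINT32_MAX; /* nothing suitable: the driver wins the bargain */
      }
      ```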

    • I believe the driver silently swapped the textures for lower-quality ones that looked worse but gave a performance boost.

    • > Was it a scandal at the time?

      Yes. My understanding was that it was "optimized" by reducing precision or something, to a visibly apparent degree.

      It's different if the driver changes things in ways such that rendered output is the same or at least imperceptibly different. I think there's also a lot more communication between gpu makers and game/engine developers these days; plus a lot more frequent updates.

      1 reply →

    • I was surprised to see “AAAA”. I didn’t know there were 4 As now.

      “AAAA Game Studio shits out another unoptimized clunker” seems a paradoxical statement to me. I would have thought “AAAA” meant “highly resourced” game company. Does it just mean high revenue? Lots of players?

      4 replies →

    • it rendered at lower quality; IIRC lower-resolution textures / much more aggressive mipmapping and/or LOD
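
      The visual effect described is easy to reproduce from the application side with a texture LOD bias, and a driver can impose the same thing silently. GL_TEXTURE_LOD_BIAS is a real OpenGL texture parameter (core in modern GL; older headers may need glext.h); the bias value here is just an assumption for illustration:

      ```c
      #include <GL/gl.h>

      /* A positive LOD bias makes the sampler pick smaller (blurrier) mip
       * levels, cutting memory bandwidth at the cost of texture detail.
       * Each +1.0 shifts sampling roughly one mip level down. */
      void force_lower_texture_detail(GLuint texture) {
          glBindTexture(GL_TEXTURE_2D, texture);
          glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_LOD_BIAS, 2.0f);
      }
      ```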

    • Except that if a developer has that kind of market pull, nVidia will gladly help those devs get it right. They are excellent at maintaining developer relations.

  • In at least one past version of Windows (circa the 1990s), if you tried to replace IE as the default web browser with another choice, you were given an Open File dialog to choose the executable.

    Funny quirk, though: that particular window wouldn't show files named firefox.exe. It would accept that as typed input if you were in the correct folder, but the file listing omitted that particular file.

    Maybe it was mozilla.exe; it was a long time ago. But that was the discovery that pushed me off IE forever.

    • I vaguely remember that being the start of the browser prompts to set your current browser as the default. It was so hard to just configure that they had to build a way to set it within the browser.

      You saw that again in more modern times when Microsoft removed support for the APIs they had provided to set browser defaults, forcing browser makers to write step-by-step instructions on what to click to set the default browser.

      I believe they walked that back, but it left such a bad taste that I switched my installation of Windows from default mode to EU mode in order to avoid it. And come to think of it, I haven’t used my windows machine for much outside of AI in about 6 months.

      But Microsoft is not alone in these sorts of defaults games: every OS or browser maker (Apple, Google, Firefox) wants to create moats so they can more easily monetize your usage of a product. I never thought I'd prefer the business model of free-to-play games, where they just outright ask you for money and have to keep finding new ways to entertain, instead of relying on hard-to-change defaults and selling your data.

      1 reply →

  • There are bugs that certain games rely on and features that some don’t use. I’m currently trying to optimize a library out of spite. (I want it to work better than the competitor that caused me a lot of problems on a recent project). The amount of conditional logic around what is essentially a function to increment a value is breathtaking.
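
    Purely for illustration, the kind of shape that ends up taking; every flag and name below is invented:

    ```c
    #include <stdbool.h>
    #include <string.h>

    /* Hypothetical: "breathtaking conditional logic" wrapped around what is
     * essentially an increment, accumulated as per-app workarounds. */
    int bump_counter(int value, const char *app_name, bool legacy_mode,
                     bool vendor_quirk_37) {
        if (legacy_mode && strcmp(app_name, "oldgame.exe") == 0)
            return value;     /* an old title relies on the counter stalling */
        if (vendor_quirk_37)
            return value + 2; /* a workaround double-counts on purpose */
        return value + 1;     /* what the function was always meant to do */
    }
    ```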

  • It’s really strange for established companies to waste their credibility on games like that…

  • I was pretty young at the time, but I recall the market for graphics cards being a lot more wide open when Quake was released. Remember 3dfx? They produced the Voodoo series of graphics cards; they're barely a distant memory now.

    Quake was also the standard for a game that was willing to fully exploit the hardware of the time.