
Comment by nikanj

5 years ago

The old maxim of "Premature optimization is the root of all evil" has over time evolved to "If you care one iota about performance, you are not a good programmer".

That belief is getting a bit outdated now that computing efficiency is hitting walls. Even when compute is cheaper than development, you're still making a morally suspect choice to pollute the environment rather than do useful work if you spend $100k/yr on extra servers instead of $120k/yr on coding. Of course, when the time and energy saved are insignificant compared to the development expense, that's when you shouldn't be fussing with performance.

I don't think the anti-code-optimization dogma will go away, but good devs already know that optimality is multi-dimensional and problem-specific, and that performance implications are always worth considering. Picking your battles is important; never fighting them, or never learning how to fight them, is not the trick.

  • I agree 100% - the whole cheery lack of care around optimization, to the point of it becoming 'wisdom', could only have happened under the artificial conditions of huge year-on-year gains in computing power.

    Still, people applying optimizations that sacrifice maintainability for very little gain, or that introduce bugs, are doing a disservice. People who understand data flow and design systems from the get-go that are optimal are where it's at.

And fwiw, the full quote is:

We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.

Yet we should not pass up our opportunities in that critical 3%.

  • Also, it’s not “never optimise”. It’s “only optimise once you’ve identified a bottleneck”. I guess in a profit-making business you only care about bottlenecks that are costing money. Perhaps this one isn’t costing money.

    • This one isn’t measured to be costing money. Trying to figure out how much money you’re losing as a result of performance problems is extremely difficult to do.

    • Not all performance problems have obvious "bottlenecks"; some are entirely second-order effects.

      For instance, everyone tells you not to optimize CPU time outside of hotspots, but even a brief peak of memory usage in cold code is really bad for system performance, because it will kick everything else out of memory.

    • Precisely. Hasn't GTA Online done over a billion in revenue?

      Given how incredibly successful it's been, it's conceivable the suits decided the opportunity cost of investing man-hours to fix the issue was too high, and that effort would be better spent elsewhere.

      7 replies →

  • Then again, using the correct data structure is not really optimisation. I usually think of premature optimisation as unnecessary effort, and using a hash map isn't that (see the sketch at the end of this sub-thread).

    • My belief is that your first goal should be cognitive optimization, i.e. make it simple and clear. That includes using hash maps when a hash map is the proper data structure, since that is what the base design calls for.

      The next step is to optimize away from the cognitively optimal, but only when necessary. So yeah it’s really crazy this was ever a problem at all.

      2 replies →

    • This is the key point: premature optimization would be e.g. denormalising a database because you think you might have a performance problem at some point, and breaking the data model for no good reason.

      Here the wrong data structure has been used in the first place.

    • If it doesn't change semantics, it's optimization.

      It's just a zero-cost one, so if anybody shows up complaining that you bothered to choose the correct data structure and calls that premature (yeah, it once happened to me too), that person is just stupid. But not calling it an optimization will only add confusion and distrust.

    • Exactly. And not only being able to pick the correct data structure for the problem (and possibly the correct implementation), but being able to know what is going to need attention even before a single line of code is written.

      Most of my optimization work is still done with pencil, paper and pocket calculator. Well, actually, most of the time you won't even need a calculator.
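
    To make the data-structure point concrete, here is a minimal sketch in C++ (the Item type, field and function names are made up for illustration; this is not the game's actual code). Both functions deduplicate a list of entries; only the cost curve differs:

    ```cpp
    #include <cstdio>
    #include <string>
    #include <unordered_set>
    #include <vector>

    // Hypothetical item record, purely for illustration.
    struct Item { std::string id; };

    // O(n^2): every new item is checked by scanning everything kept so far.
    std::vector<Item> dedup_linear(const std::vector<Item>& input) {
        std::vector<Item> out;
        for (const auto& item : input) {
            bool seen = false;
            for (const auto& existing : out)          // linear scan per item
                if (existing.id == item.id) { seen = true; break; }
            if (!seen) out.push_back(item);
        }
        return out;
    }

    // O(n) expected: the membership check is a hash lookup instead of a scan.
    std::vector<Item> dedup_hashed(const std::vector<Item>& input) {
        std::vector<Item> out;
        std::unordered_set<std::string> seen;
        for (const auto& item : input)
            if (seen.insert(item.id).second)          // true only on first sight
                out.push_back(item);
        return out;
    }

    int main() {
        std::vector<Item> items{{"a"}, {"b"}, {"a"}, {"c"}, {"b"}};
        std::printf("linear: %zu unique, hashed: %zu unique\n",
                    dedup_linear(items).size(), dedup_hashed(items).size());
    }
    ```

    The semantics are identical, so picking the hashed version up front is just ordinary design, not premature optimization; with tens of thousands of entries, the quadratic version is the kind of thing that quietly turns a load step into minutes.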

That doesn't really apply here. I don't even play GTA V, but the #1 complaint I've heard for the past 6 years is that the load times are the worst thing about the game. Once something is known to be the biggest bottleneck in the enjoyment of your game, it's no longer "premature optimization". The whole point of that saying is that you should first make things, then optimize the things that bring the most value. The load time is one of the highest-value things you can cut down on. And the fact that these two pieces of low-hanging fruit made such a big difference tells me they never gave it a single try in the past 6 years.

  • Sure it does apply. These complaints came out after the game had been released. They should have optimized this before they released, back when they were designing the system. However, that's considered premature optimization, when in fact it's just bad design.

    • > back when they were designing the system

      See, that's exactly why you're wrong. This wasn't a bad "design". If the fix had required rebuilding the whole thing from scratch, then you would have a point, and thinking about it "prematurely" would've been a good idea. In this case, both fixes were for bugs that could be fixed after the game was finished without having to undo much of the work done.

      The whole point of the saying is that you don't know what's gonna be a bottleneck until after the fact. Yes, by optimizing prematurely you would've caught those two bugs early, but you would've also spent time analyzing a bunch of other things that didn't need to be analyzed. Whereas if you analyze it at the end, once people complain about it being slow, you're only spending the minimum amount of time necessary on fixing the things that matter.

      3 replies →

  • We used to start up Quake while we waited, and then we'd forget about GTAO. Later we'd discover GTA had kicked us out for being idle too long. Then we'd just close it.

    That should be embarrassing for Rockstar but I don't think they would even notice.

The problem here isn't a lack of optimization, it's a lack of profiling. Premature optimization is a problem because you will waste time and create more complex code optimizing in places that don't actually need it, since it's not always intuitive what your biggest contributors to inefficiency are. Instead of optimizing right away, you should profile your code and figure out where you need to optimize. The problem is that they didn't do that.
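
Even crude measurement beats guessing. A sampling profiler (perf, VTune, or whatever the platform ships) is the right tool, but as a rough sketch, even a throwaway scoped timer around each load phase would have shown where the minutes go. Everything below is illustrative: the phase names and the sleeps stand in for real work.

```cpp
#include <chrono>
#include <cstdio>
#include <string>
#include <thread>

// Crude RAII timer: prints how long the enclosing scope took.
struct ScopedTimer {
    std::string label;
    std::chrono::steady_clock::time_point start = std::chrono::steady_clock::now();
    explicit ScopedTimer(std::string l) : label(std::move(l)) {}
    ~ScopedTimer() {
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                      std::chrono::steady_clock::now() - start).count();
        std::printf("%-15s %lld ms\n", label.c_str(), static_cast<long long>(ms));
    }
};

int main() {
    // In a real build you would wrap the actual parsing / asset-loading calls;
    // the sleeps here are placeholders so the sketch runs as-is.
    { ScopedTimer t("parse catalog"); std::this_thread::sleep_for(std::chrono::milliseconds(120)); }
    { ScopedTimer t("build indices"); std::this_thread::sleep_for(std::chrono::milliseconds(30)); }
}
```

The output immediately tells you which phase dominates, which is exactly the information the "optimize the bottleneck" advice assumes you already have.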

  • I'd like to add: while GTA is running I'm really impressed by its performance, even (especially) back when it was on the PS3; you could drive or fly at high speed through the whole level, see for miles, and never hit a loading screen. It is a really well-optimized game, and that same level of optimization carries over into RDR2.

    Which makes the persisting loading issue all the weirder.

I think this part of Knuth's quote is central:

> Programmers waste enormous amounts of time thinking about, or worrying about, the speed of noncritical parts of their programs, and these attempts at efficiency actually have a strong negative impact when debugging and maintenance are considered.

And he is explicitly advocating for optimizing the critical part:

> Yet we should not pass up our opportunities in that critical 3%.

And somehow, people have latched onto the catchphrase about "early optimization" that was taken out of context.

I went back and had a look at that maxim a few years ago and found that it actually doesn't say what many people claim it says. And it definitely doesn't support the blanket excuse for slow code that it has, to some degree, always been used as.

The reason for the misunderstanding is that the kinds of practices it actually talks about are uncommon today. People would often take code written in higher-level languages and reimplement it in assembler or machine code, which made it more time-consuming to change and evolve.

It also isn't as if it's hard to figure out which part of a piece of software is taking up your runtime these days. All worthwhile languages have profilers, and these are mostly free, so there is zero excuse for not knowing what to optimize. Heck, it isn't all that uncommon for people to run profiling in production.

Also, it isn't as if you can't know ahead of time which bits need to be fast. Usually you have some idea, so you will know what to benchmark. Long startup times probably won't kill you, but when they are so long that they become a UX issue, it wouldn't have killed them to have a look.

Back in the day, when people talked about premature optimization, it was about trivial things like people asking on Stack Overflow whether a++, ++a or a += 1 is faster. Worrying about that is obviously a net loss, since ultimately it literally doesn't matter. If it matters to you, you are already an expert in the subject and should just benchmark your code.
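
If you really do want to settle that kind of micro-question, a benchmark harness is the honest way to do it. A minimal sketch, assuming the Google Benchmark library is available (the benchmark names are made up); on any optimizing compiler the two cases compile to identical code, which is rather the point:

```cpp
#include <benchmark/benchmark.h>

// Pre-increment.
static void BM_PreIncrement(benchmark::State& state) {
    long a = 0;
    for (auto _ : state) {
        ++a;
        benchmark::DoNotOptimize(a);   // keep the compiler from deleting the loop body
    }
}
BENCHMARK(BM_PreIncrement);

// Compound assignment.
static void BM_PlusEquals(benchmark::State& state) {
    long a = 0;
    for (auto _ : state) {
        a += 1;
        benchmark::DoNotOptimize(a);
    }
}
BENCHMARK(BM_PlusEquals);

BENCHMARK_MAIN();
```

When the two numbers come out the same, that's the benchmark politely telling you to go optimize something that matters.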