Comment by api

13 years ago

I've even seen this mentality in startups. It does have some business rationale, provided you are thinking short term and focused only on near-term goals.

One of the reasons businesses have trouble really innovating is that it's hard for a business to work on long-term things when markets are very short-sighted. Only mega-corps, monopolies, and governments can usually do that... or hobbyists / lifestyle businesses who are more casual about hard business demands.

That being said, MS is surely cash-rich enough to think long term. So this doesn't apply as much here.

I've also found that, of all things, optimization almost gets you looked down upon in most teams -- even young ones. "Premature optimization is the root of all evil," and all that, which is usually misinterpreted as "optimization is naive and a waste of time." It's seen as indicative of an amateur or someone who isn't goal-focused. If you write "optimized X" in a commit message, you're likely to get mocked or reprimanded.

In reality, "premature optimization is the root of all evil" is advice given to new programmers so they don't waste time dinking around with micro-optimizations instead of thinking about algorithms, data structures, and higher order reasoning. (Or worse, muddying their code up to make it "fast.") Good optimization is actually a high-skill thing. It requires deep knowledge of internals, the ability to really comprehend profiling, and precisely the kind of higher-order algorithmic reasoning you want in good developers. Most good optimizations are algorithmic improvements, not micro-optimizations. Even good micro-optimization requires deep knowledge -- like understanding how pipelines and branch prediction and caches work. To micro-optimize well you've got to understand, soup to nuts, everything that happens when your code is compiled and run.
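
A rough sketch of what I mean by "algorithmic improvement, not micro-optimization" -- a toy example where all the names and sizes are made up, same output either way, just a better data structure:

    # Toy illustration: finding values common to two lists.
    # All names and sizes here are made up for the example.
    import timeit

    def common_items_quadratic(xs, ys):
        # O(n*m): scans all of ys for every element of xs.
        return [x for x in xs if x in ys]

    def common_items_with_set(xs, ys):
        # O(n+m): build a set once, then do constant-time membership tests.
        ys_set = set(ys)
        return [x for x in xs if x in ys_set]

    xs = list(range(10000))
    ys = list(range(5000, 15000))
    print(timeit.timeit(lambda: common_items_quadratic(xs, ys), number=1))
    print(timeit.timeit(lambda: common_items_with_set(xs, ys), number=1))

No amount of pipeline-level tuning of the first version will catch the second once the lists get big.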

Personally I think speed is really important. As a customer I know that slow sites, slow apps, and slow server code can be a reason for me to stop using a product. Even if the speed difference doesn't impact things much, a faster "smoother" piece of code will convey a sense of quality. Slow code that kerchunks around "feels" inferior, like I can see the awful mess it must be inside. It's sort of like how luxury car engines are expected to "purr."

An example: before I learned git and realized what an innovative paradigm shift it was, speed was what sold me on it. The first time I did a git merge on a huge project I was like "whoa, it's done already?" SVN would have been kerchunking forever. It wasn't that the speed mattered that much. It was that the speed communicated to me "this thing is the product of a very good programmer who took their craft very seriously as they wrote it." It told me to expect quality.

Another example: I tried Google Drive, but uninstalled it after a day. It used too much CPU. In this case it actually mattered -- on a laptop this shortens battery life, and mine noticeably declined. This was a while ago, but I have not been motivated to try it again. The slowness told me "this was a quick hack, not a priority." I use Dropbox because their client barely uses the CPU at all, even when I modify a lot of files. Google Drive gives me more storage, but I'm not content to sacrifice an hour of battery life for that.

(Side note: on mobile devices, CPU efficiency has a much more rigid cost function. Each cycle costs battery.)

Speed is a stealth attribute too. Customers will almost never bring it up in a survey or a focus group unless it impacts their business. So it never becomes a business priority.

Edit: relevant: http://ubiquity.acm.org/article.cfm?id=1513451

> In reality, "premature optimization is the root of all evil" is advice given to new programmers so they don't waste time dinking around with micro-optimizations instead of thinking about algorithms, data structures, and higher order reasoning. (Or worse, muddying their code up to make it "fast.")

It's also for experienced programmers who dink around with macro-optimizations. For example, designing an entire application to be serializable-multi-threaded-contract-based when there's only a handful of calls going through the system. Or creating an abstract-database-driven-xml-based UI framework to automate the creation of tabular data when you have under a dozen tables in the application.

"Premature optimization is the root of all evil" is a really important mindset, and I agree it doesn't mean you shouldn't optimize -- though many developers seem to take it that way.

X = How many transactions your business does today

Y = How many transactions your business needs to do in order to survive

Y/X = What the current application needs to scale to in order to simply survive. This is the number where people start receiving paychecks.

(Y/X) * 4 = How far the current application needs to scale in order to grow.

The goal should be to build an application that can just barely reach (Y/X) * 4 -- this means building unit tests that exercise the application under a load of (Y/X) * 4, and optimizing for (Y/X) * 4.

Spending time trying to reach (Y/X) * 20 or (Y/X) * 100 is what I'd call premature optimization.

Disclaimer: the factor of 4 in (Y/X) * 4 isn't based on any real data point that I know of, just something I pulled out as an example; anyone who knows of actual metrics, please feel free to correct me.
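
To put the arithmetic in one place, a tiny sketch with invented numbers -- the 4 is only my example multiplier from above, not a real metric:

    # Tiny sketch of the scaling targets, with invented numbers.
    transactions_today = 2000        # X: what the business does now
    transactions_to_survive = 10000  # Y: where people start receiving paychecks

    survival_scale = transactions_to_survive / transactions_today  # Y/X = 5.0
    growth_target = survival_scale * 4                             # (Y/X) * 4 = 20.0

    print("Load-test and optimize for about %.0fx today's traffic" % growth_target)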

  • > > In reality, "premature optimization is the root of all evil" is advice given to new programmers

    > It's also for experienced programmers who dink around with macro-optimizations.

    The hilarious thing is that everyone seems to think everyone else's optimizations are premature.

    Don't believe me? Just go ahead and ask someone an optimization question, like maybe on StackOverflow. Those who respond might not know you, but I'll be damned if their first reply isn't to tell you that you probably shouldn't even be optimizing in the first place, because obviously if you'd thought about it more carefully, you'd have realized it's premature.

  • The canonical version is Alan J. Perlis's Epigram 21:

       'Optimization hinders evolution.'
    

    If you have a black box, then optimize the fuck out of it. The Windows kernel is not a black box.

    • I think you missed half the argument. Windows is noticeably slower than Linux or Mac on the same hardware. Isn't that a problem?

      And if optimization always hinders evolution, boy should Windows be evolving... I mean... the NT kernel should have smashed through all kinds of antiquated paradigms by now. It should be doing memory deduplication, disk deduplication, fast JIT compilation of binaries for alternative architectures. It should support live process migration between machines, joining systems together efficiently to form larger super-systems, a better permission model obviating the need to rely completely on virtualization for true privilege isolation in enterprise environments. It should have truly efficient network filesystems supporting disconnected operation, sharding, etc.

      Oh wait... it's stuck in the 90s... never mind. And it's slow.

      Linux, which optimizes a lot, has at least some of the things I mentioned above.

      "Premature optimization is the root of all evil" is a deeply nuanced statement that is nearly always misunderstood. "Optimization hinders evolution" is probably likewise. They're quotes cherry-picked out of context from the minds of great craftsmen who deeply understand their craft, and alone I do not believe they carry the full context required to comprehend what they really mean. I think they have more to do with maintaining clarity and focusing on higher-order reasoning than they do with whether or not to try to make things run faster. (And like I said, the most effective optimizations are usually higher-order conceptual algorithmic improvements.)

      1 reply →

    • I recently realized, for example, that the HotSpot JVM is the R.M.S. Titanic, and invokedynamic (or any change to the bytecode) is the iceberg. That's probably one reason why we're still waiting for lambdas, in-memory transactions, etc. It's too large to evolve quickly.

      1 reply →

  • "Or creating an abstract-database-driven-xml-based UI framework to automate the creation of tabular data when you have under a dozen tables in the application."

    I'm not an experienced programmer, and I'd use that approach anyway, simply because it makes sense (is there any alternative that isn't worse?). What does that mean? (Oh, and I'd skip XML. I abhor violence.)

    • Factories only make economic sense if you are planning on building a large number of widgets.

      For small applications, dynamic form generation infrastructure dwarfs the actual business logic. It means writing a lot of code which isn't solving your business problem.

      Your project has an extra layer of 'meta'. It's harder to debug. It decreases flexibility. Validation is hard for multi-field relationships. Special-casing that one form can require major infrastructure changes. The approach tends to be slower and buggier than the naive approach for all but the most form-heavy applications.

      4 replies →

    • For a small number of tables?

      Just bite the bullet and render them by hand. It'll take less time than writing the abstract whatever-driven autogenerated UI framework, making it lay things out nicely, handling all the corner cases and auto-wiring that might not be needed in all cases, and tweaking the generated code to look good for all the data.
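
      To give a sense of scale, "by hand" can be as boring as a rough sketch like this (field names invented), one plain function per table:

          # Hand-rendered option: one dumb function per table, no metadata layer.
          # Field names are invented for the example.
          import html

          def render_users_table(users):
              header = "<tr><th>Name</th><th>Email</th></tr>"
              rows = "".join(
                  "<tr><td>%s</td><td>%s</td></tr>"
                  % (html.escape(u["name"]), html.escape(u["email"]))
                  for u in users)
              return "<table>%s%s</table>" % (header, rows)

          print(render_users_table([{"name": "Ada", "email": "ada@example.com"}]))

      It's dull code, but it's all directly readable and trivially special-cased.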

      1 reply →

  • I admit to doing this... made a multithreaded simulation framework for use by my game team... too bad the games they make with it are so computationally simple as to make the sim framework a complete waste :(

The concept of "premature optimization" also has another connotation in product development: Don't waste too much time making that product or feature optimized until you are convinced you can actually sell it. It's not that optimization is bad, but optimization before market trial (premature) can result in you spending precious time working hard on the wrong thing.

Optimizing the right thing is good, but figure out what that thing is first.

  • In a publicly traded company such as Microsoft, that doesn't take long: it is shareholder value. It is not the optimization "for its own sake" promoted in the article. A savvy developer may need to run business tests against their software.

    • > It is shareholder value.

      But in what timeframe? The article seems to imply they are thinking only about short-term value and ignoring the medium and long term.

      7 replies →