
Comment by rictic

5 years ago

True that it's rare that you need to pull out obscure algorithms or data structures, but in many projects you'll be _constantly_ constructing, composing, and walking data structures, and it only takes one or two places that are accidentally quadratic to make something that should take milliseconds take minutes.
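For a concrete illustration (a hypothetical sketch, not any particular codebase), here's the shape this usually takes in Python: nothing in the first function looks like a nested loop, but the membership check scans the list on every iteration.

```python
# Hypothetical sketch: collecting unique items while walking a structure.
# `item not in result` scans the whole list each time, so the loop is
# O(n^2) even though there's no visible nested loop.
def unique_items(items):
    result = []
    for item in items:
        if item not in result:  # O(n) scan on every iteration
            result.append(item)
    return result

# The same walk in O(n): set membership is O(1) on average.
def unique_items_fast(items):
    seen = set()
    result = []
    for item in items:
        if item not in seen:
            seen.add(item)
            result.append(item)
    return result
```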

The mindset of constantly considering the big-O category of the code you're writing and reviewing pays off big. And neglecting it costs big as well.

Except that you need to test your software, and if you see performance problems, profile them to identify the cause. It's not like you have one single chance to get everything right.
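To make that concrete, a minimal sketch of doing exactly that with Python's standard-library profiler (the slow function here is a hypothetical stand-in for whatever code path is dragging):

```python
import cProfile
import random

# Hypothetical stand-in for a slow code path: comparing
# every pair of values is O(n^2).
def count_duplicate_pairs(values):
    count = 0
    for i, a in enumerate(values):
        for b in values[i + 1:]:
            if a == b:
                count += 1
    return count

data = [random.randrange(100) for _ in range(5_000)]

# Sorting by cumulative time puts the hotspot at the top of the report.
cProfile.run("count_duplicate_pairs(data)", sort="cumulative")
```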

  • The later in development a problem is caught, the more expensive it is. The farther it gets along the pipeline of concept -> prototype -> testing -> commit -> production, the longer it's going to take to notice, repro, identify the responsible code, and fix.

    It's true that you don't just have one shot to get it right, but you can't afford to be littering the codebase with accidentally quadratic algorithms.

    I fairly regularly encounter code that performed fine when it was written; then the input grew from X0 cases to X000 cases, and now this bit of N^2 code takes minutes when it should take milliseconds (see the sketch just below).
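A sketch of that failure mode (names hypothetical): timing an accidentally quadratic dedupe as the input grows. Growing the input ~100x grows the runtime ~10,000x, so expect the largest case to genuinely take minutes in CPython while the smallest is effectively instant.

```python
import time

def dedupe(items):
    result = []
    for item in items:
        if item not in result:  # the accidental O(n^2)
            result.append(item)
    return result

# ~100x more input -> ~10,000x more work.
for n in (500, 5_000, 50_000):
    data = list(range(n))
    start = time.perf_counter()
    dedupe(data)
    print(f"n={n:>6}: {time.perf_counter() - start:.3f}s")
```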

People complain about Big-O once they reach the end of its usefulness: your algorithm is already O(n) or O(n log n), but it is still too slow.
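In that regime the remaining wins come from constant factors, which Big-O deliberately ignores. A small illustrative sketch: both loops below are O(n), so complexity analysis cannot tell them apart, yet in CPython the built-in typically wins by several-fold; at that point a profiler, not Big-O, is the tool.

```python
import time

data = list(range(10_000_000))

# O(n), with Python-level bytecode dispatch on every element.
start = time.perf_counter()
total = 0
for x in data:
    total += x
print(f"interpreted loop: {time.perf_counter() - start:.3f}s")

# Also O(n), but the loop runs in C inside the built-in.
start = time.perf_counter()
total = sum(data)
print(f"built-in sum:     {time.perf_counter() - start:.3f}s")
```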