Comment by deathanatos

12 years ago

> Rule 3. Fancy algorithms are slow when n is small, and n is usually small. Fancy algorithms have big constants. Until you know that n is frequently going to be big, don't get fancy. (Even if n does get big, use Rule 2 first.)

Thing is, I don't know; n is definitely small in our unit tests, but if it isn't guaranteed¹ to be small, I'd rather start the code off as something with O(decent) as opposed to O(whatever an array gets me). It's not clear to me if this is what Rob Pike means though: does simply using a hashtable when appropriate to avoid iterating through a list count as "fancy"?

When the ultimate bounds of n aren't certain, I'd rather start with a good O(...) algorithm, and then "measure, and optimize" from there.

¹and even then, if my "guarantee" is coming from something other than physics, I'm wary.

If you can get a naive n^2 solution, or build a complicated n lg n solution, and you know the size of the data will be ~100 items, certainly never >1000 items, I'd choose the naive solution.

Edit: and it's not in the middle of a tight loop
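To make the trade-off concrete, here's a hedged sketch (my own illustration, not from the original comment) of the "iterate through a list" versus "use a hashtable" choice, using duplicate detection as the example task:

```python
def has_duplicate_naive(items):
    """O(n^2) pairwise scan: trivially correct, fine when n is ~100."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_hashed(items):
    """O(n) expected time with a hash set: worth it if n may grow large."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

At n ≈ 100 the naive version is plenty fast and easier to verify; the hashed version is what I'd reach for when n's bounds aren't guaranteed. Note that here the "fancy" version isn't even more complicated, which is part of why I'm unsure it counts as fancy in Rule 3's sense.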