Comment by danielmarkbruce
5 hours ago
This very much depends on where you work... and basically isn't true for most people. It's extremely true for some people.
Rule 3 is still very much real. Fancy fast algorithms often have other trade-offs. The best algorithm for the job is the one that meets all requirements well... Big-O is one aspect, data is another, determinism of the underlying things that are needed (dynamic memory allocation, etc) can be another.
It is important to remember that the art of sw engineering (like all engineering) lives in a balance between all these different requirements; not just in OPTIMIZE BIG-O.
Sure, but the default (and usually correct) assumption when working at Google (as an example) is basically "all numbers are big", so you have to be clued up about algorithms and data structures and not default to brute-forcing something.
At 99% of shops it should be the other way around.
Even when you are working with large numbers, most numbers are usually small. Most of the code is probably not dealing with the large things, and a large thing may consist of a large number of instances that are individually small.
I've personally found it useful to always have concrete numbers in mind. An algorithm or data structure designed for N will probably be fine for N/10 and 10N, but it will often be inefficient for N/1000 or 1000N.
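To make that concrete, here's a toy sketch (my own illustration, not from the thread): deduplicating a list with an O(n²) scan versus an O(n) set. Both are correct; which one is "best" depends entirely on the N you actually expect.

```python
def dedupe_small(items):
    # O(n^2) brute force: repeated linear membership tests.
    # Perfectly fine when n is tiny (say, tens of elements),
    # and it's simpler, allocation-free, and cache-friendly.
    out = []
    for x in items:
        if x not in out:
            out.append(x)
    return out

def dedupe_large(items):
    # O(n) with a set: extra memory and hashing overhead,
    # but it pays off once n reaches the thousands.
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

print(dedupe_small([3, 1, 3, 2, 1]))  # [3, 1, 2]
print(dedupe_large([3, 1, 3, 2, 1]))  # [3, 1, 2]
```

Designed around N ≈ 100, either works; at 1000N the brute-force version becomes a problem, and at N/1000 the set version is just needless machinery.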