Comment by decasia
4 years ago
Yes, I absolutely think profiling and then only optimizing the actual problems is always a sound choice.
I don't check the docs for every library function I use. I'm just saying, it wouldn't hurt if, when you do read the docs for standard library functions, the algorithmic complexity was mentioned in passing.
In principle, that sounds good. But then it can happen that you profiled when N=1000 and it seems fine. Then a few years later (like in GTA), N has grown to 63,000 and it's no longer fine. It seems unlikely the developer will go back and profile it again. Also, I think the original Windows Update algorithm for figuring out which updates you needed to download started out fine, but 20 years later it turns out it's quadratic and now there are tens of thousands of updates, so it becomes almost impossible to install XP SP2 from a CD and have it update itself.
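This "fine at N=1,000, terrible at N=63,000" pattern usually comes from a linear operation hidden inside a loop. A minimal sketch (hypothetical function names, not from any of the codebases mentioned) of how the quadratic version stays invisible at small N:

```python
import timeit

def dedupe_quadratic(items):
    """O(n^2): membership test against a list is itself O(n)."""
    seen = []
    for item in items:
        if item not in seen:   # linear scan, repeated n times
            seen.append(item)
    return seen

def dedupe_linear(items):
    """O(n): set membership is O(1) on average."""
    seen = set()
    out = []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out

# Both produce the same answer; only the growth rate differs.
for n in (1_000, 63_000):
    data = list(range(n))
    t = timeit.timeit(lambda: dedupe_quadratic(data), number=1)
    print(f"n={n:>6}: {t:.3f}s")
```

At n=1,000 the quadratic version finishes in milliseconds and a profile would flag nothing; at n=63,000 it does roughly 4,000x the work, which is the same trap the GTA JSON parser and (reportedly) Windows Update fell into.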
Also, don't forget the quadratic-time desktop icon arrangement: https://news.ycombinator.com/item?id=26152335
Do you have a link for the Windows Update algorithm being quadratic?
I believe this is what the parent was referring to: https://arstechnica.com/information-technology/2013/12/expon...
Luckily, this happens to be one of the places where learning computer science can help!