Comment by adrian17
2 days ago
> On this benchmark, the compiler is almost twice as fast as it was three years ago.
I think the cause of the public perception issue could be the variant of Wirth's law: the size of an average codebase (and its dependencies) might be growing faster than the compiler's improvements in compiling it?
Yeah, definitely when you include dependencies. I've also noticed that once your dependency tree gets above a certain size, you end up pulling in multiple alternative crates for the same task, because e.g. one of your dependencies uses miniz_oxide and another uses zlib-rs (or whatever).
On the other hand, the compile time for most dependencies doesn't matter hugely, because they are easy to build in parallel. It's always the last few crates and linking that take half the time.