Comment by Mawr
2 days ago
And your point is completely wrong. It makes no sense for a language to optimize, by default, for the lowest possible binary size of a "hello world"-sized program. Nobody is in the business of shipping "hello world" to binary-size-sensitive customers.
Non-toy programs tend to be big, and the size of their own code will dwarf whatever static overhead there is, so your argument does not scale.
Even then, binary size is a low-priority item for almost all use cases.
But even if you do care about it, guess what: every low-level language (Rust, C, whatever) will let you get close to the smallest possible size if you put in the effort.
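For Rust specifically, "the effort" starts with flipping the standard Cargo profile switches. A minimal sketch (these are stock Cargo options; strip needs Rust 1.59 or newer):

    # Cargo.toml: size-oriented release profile
    [profile.release]
    opt-level = "z"    # optimize for size rather than speed
    lto = true         # link-time optimization across the whole program
    codegen-units = 1  # trade build parallelism for better optimization
    panic = "abort"    # drop the stack-unwinding machinery
    strip = true       # strip symbols and debug info from the binary

That alone won't match a dynamically linked C hello world, since Rust statically links its standard library by default, but it cuts a noticeable chunk off, and the remaining steps (no_std, build-std, a leaner allocator) are there for anyone with an actual size budget.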
So no, on no level does your argument make sense with any of the examples you've given.
Did you notice how nobody in this thread has argued that Rust should optimize for binary size, or that Rust is somehow wrong for having made the trade-offs it did?
People who have only ever worked with languages that produce large executables are likely to believe that 300k or so is about as small as an executable can possibly be. And if they believe that, then of course they'll also believe that a serious program must be many megabytes in size. If you don't have a good grasp of how capable modern computers are, your estimates will be off by orders of magnitude.
There are countless examples (you don't have to go that far back in history to find them) where dozens of engineers worked for years on a complicated program that compiled down to maybe 500kb.
And again, the point still isn't that we should write software today like we had to in the 80s or 90s.
However, programmers should be aware of how many orders of magnitude slower or bigger their program is relative to a thoroughly optimized version. A program can be small compared to the size of your disk and still be 1000x larger than it needs to be.