Comment by drob518
21 hours ago
Startup time has always been a bit of a sketchy metric as modern OSs and languages do a lot of processing on application launch. Some scan for viruses. On Macs you have checks for x86 vs Apple silicon and loading of Rosetta if required. Managed runtime environments have various JITs that get invoked. And apps are now huge webs of dependencies, so lots of dynamically linked code being loaded. A better metric is their performance once everything is in memory. That said, I still think we’re doing poorly at that metric as well. As resources have ballooned over the last decade, we’ve become lazy and we just don’t care about writing tight code.
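If it helps, here's a rough way to measure the metric I mean: warm the code path up first, then time it once everything is in memory. The workload below is just a placeholder, not anyone's real app:

```python
"""Sketch: measure steady-state performance, not launch cost.
The workload is a stand-in; substitute whatever operation you care about."""
import timeit

def workload():
    # Placeholder "app work": CPU-bound once everything is already loaded.
    return sum(i * i for i in range(10_000))

# Warm-up run pays the one-time costs (imports, allocator, CPU caches) up front.
workload()

# Steady state: best of several repeats filters out scheduler noise.
best = min(timeit.repeat(workload, number=100, repeat=5)) / 100
print(f"steady-state per call: {best * 1e6:.1f} us")
```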
Why not both?
Applications get launched at system startup precisely because they are slow to start on demand. And they are slow to start because of the JIT runtime, the pile of dependencies, and all the .dlls that have to be loaded.
At the end of the day, end users pay the cost of developer convenience (JIT and dependencies most of the time, even though there are some cases where dynamic linking is fine) because developers don't ship native apps.
Offloading everything to system startup is a symptom, IMO.
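A crude way to see that tax, assuming a POSIX machine with /bin/true and python3 installed (python3 is just standing in for "some managed runtime"; the ratio matters, not the absolute numbers):

```python
"""Sketch: compare launch time of an (almost) empty native process with an
(almost) empty managed/interpreted one. Paths and commands are assumptions."""
import subprocess
import time

def launch_time(cmd, runs=10):
    # Take the best of several runs so we measure warm launches, not disk-cache misses.
    best = float("inf")
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        best = min(best, time.perf_counter() - start)
    return best

native = launch_time(["/bin/true"])               # native binary, no runtime to bring up
managed = launch_time(["python3", "-c", "pass"])  # interpreter + stdlib import machinery

print(f"native  : {native * 1000:.2f} ms")
print(f"managed : {managed * 1000:.2f} ms ({managed / native:.0f}x)")
```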
Replying to your specific point about virus scans. For some (naive) reason, I expect them to run a single time against a binary app that is never changed. So in theory it shouldn't even be a problem, but reality says otherwise.
> Replying to your specific point about virus scans. For some (naive) reason, I expect them to run a single time against a binary app that is never changed.
Playing devil's advocate: the executable might not have changed, but the database of known virus signatures changes daily or more often. Either every single executable would have to be re-scanned every time the database updates, or the executable has to be lazily scanned on load.
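In other words, it's a cache-invalidation problem: a scan verdict is only valid for a (file hash, signature DB version) pair. Toy sketch, with every name made up for illustration (this is not how any real AV product is implemented):

```python
"""Toy sketch: why "scan once per binary" isn't a sufficient cache key.
The DB version, cache, and scan() below are hypothetical placeholders."""
import hashlib

SIGNATURE_DB_VERSION = "2024-06-01"   # hypothetical: bumps daily or more often
scan_cache = {}                       # (file_hash, db_version) -> verdict

def file_hash(path):
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def scan(path):
    # Stand-in for matching the file against the current signature database.
    return "clean"

def scan_on_load(path):
    key = (file_hash(path), SIGNATURE_DB_VERSION)
    if key not in scan_cache:
        # Unchanged binary + new DB version = cache miss, so it gets rescanned on load.
        scan_cache[key] = scan(path)
    return scan_cache[key]

print(scan_on_load("/bin/ls"))  # assumes a Unix-like system; point it at any binary
```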