Comment by godelski
2 months ago
>> You especially need to consider typical computer usage involves using more than one application at a time. There's a tragedy of the commons issue
The shared resources include:
- disk/ssd/long term memory
- RAM/System memory
- Cache
BUT ALSO
- Registers
- CPU Cores
- Buses/Lanes/Bandwidth
- Locks
- Network
My point is that I/O only dominates when everything else is already behaving efficiently, and that's what you see when you measure a single program running in isolation.
You're forgetting that when multiple programs are running there's a lot more going on, and a lot more communication too. The caches are super tiny and in high competition, and the OS has to interleave all those instruction streams. Even a program's niceness can dramatically change total performance. This is especially true when we're talking about unoptimized programs, because all those little things the OS has to manage pile up.
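To make the contention point concrete, here's a minimal sketch (pure Python; the name `thrash`, the buffer size, and the process count are just my choices for illustration, not from any real benchmark). It times the same memory-scanning loop alone and then while competitor processes hammer memory, so cache and bandwidth pressure shows up as wall-clock time even though there's no disk or network I/O anywhere.

```python
import multiprocessing as mp
import time
from array import array

BUF_WORDS = 4_000_000  # ~32 MB of 64-bit ints; big enough to blow past the caches


def scan(buf):
    # Walk the buffer once; bound by memory/cache behaviour, not disk or network
    total = 0
    for x in buf:
        total += x
    return total


def thrash(stop):
    # Competitor process: keep touching a large buffer until told to stop
    buf = array("q", range(BUF_WORDS))
    while not stop.is_set():
        scan(buf)


def timed_scan(label):
    buf = array("q", range(BUF_WORDS))
    t0 = time.perf_counter()
    scan(buf)
    print(f"{label}: {time.perf_counter() - t0:.3f}s")


if __name__ == "__main__":
    timed_scan("alone")

    stop = mp.Event()
    workers = [mp.Process(target=thrash, args=(stop,)) for _ in range(mp.cpu_count())]
    for w in workers:
        w.start()
    try:
        timed_scan("with competitors")
    finally:
        stop.set()
        for w in workers:
            w.join()
```

On a POSIX system you can also add `os.nice(19)` at the top of `thrash()` and re-run it: just changing the neighbours' scheduling priority moves the foreground number, which is the niceness effect I mean.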
Get out your computer architecture book and do a skim to refresh. Even Knuth's books[1] discuss much of this, because to write good programs you gotta understand the environment they're running in. Otherwise it'd be like trying to build a car without knowing if you're building it for the city, Antarctica, or even the moon. The environment is critical to the assumptions you can make.
Actually had a good real-world example today. My TV got an update a month ago and since then most of the apps don't actually work: Netflix plays with like 3 pixels, Hulu just hangs. Luckily I use the TV as a monitor 99% of the time. But I think we all know how slow a lot of these systems get. Everything works fine when it ships, but like I suggested, issues build over time. One app here, another there, and before you know it you're buying a new TV, computer, phone, whatever.