Comment by maeln

21 hours ago

More than CPU speed, I think the increase in storage and RAM is to blame for the slow decay in latency. When you have only a few KB/MB of RAM and storage, you can't really afford to add much more to the software than its core feature. Your binary needs to be small, which leads to faster loading into RAM, and to do less, which means fewer things to run before the actual program.

When size is not an issue, it's harder to say no when the business demands a telemetry system, an auto-update system, a crash handler with automatic reporting, and a bunch of features, a lot of which need to be initialized at the start of the program, introducing significant startup latency.
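One common way to keep those subsystems from blowing up startup time is lazy initialization: pay for a feature only when it is first used. A minimal Python sketch of the idea (`init_telemetry` and its 0.2 s cost are made up for illustration):

```python
import time

class Lazy:
    """Defer an expensive subsystem's setup until it is first used."""
    def __init__(self, factory):
        self._factory = factory
        self._value = None

    def get(self):
        if self._value is None:
            self._value = self._factory()  # pay the cost here, once
        return self._value

def init_telemetry():
    # Stand-in for expensive work (loading config, opening sockets, ...)
    time.sleep(0.2)
    return "telemetry ready"

start = time.perf_counter()
telemetry = Lazy(init_telemetry)  # program startup: nothing expensive runs
startup_cost = time.perf_counter() - start

print(f"startup took {startup_cost * 1000:.1f} ms")  # well under 200 ms
print(telemetry.get())  # first use pays the 0.2 s; later calls are free
```

Of course this only moves the cost rather than removing it, but it keeps the time-to-first-interaction short, which is what users actually feel.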

It's also complexity: more added than necessary, and at a faster pace than hardware can keep up with.

Take font rendering: in early machines, fonts were small bitmaps (often 8x8 pixels, 1 bit/pixel), hardcoded in ROM. As screen resolutions grew (and varied between devices), OSes stored fonts in different sizes. Later: scalable fonts, chosen from a selection of styles / font families, rendered to sub-pixel accuracy, with the sub-pixel configuration adjustable to match the hardware construction of the display panel.

Yeah, this is very flexible & can produce good-looking fonts (if set up correctly), which scale nicely when zooming in or out.

But it also makes rendering each single character a lot more complex. And thus eats a lot more CPU, RAM & storage than an 8x8 fixed-size, 1 bpp font.
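For contrast, here's roughly what the old approach cost: a Python sketch of an 8x8, 1 bpp bitmap glyph (the `GLYPH_A` bit pattern is a made-up example, not from any real ROM font):

```python
# Hypothetical 8x8, 1-bit-per-pixel glyph for 'A': one byte per row,
# each set bit is a lit pixel. A full 256-glyph font is 256 * 8 = 2 KB.
GLYPH_A = [0x18, 0x24, 0x42, 0x42, 0x7E, 0x42, 0x42, 0x00]

def render(glyph):
    # "Rendering" is just testing bits -- no Bezier curves, no hinting,
    # no anti-aliasing, no sub-pixel filtering.
    return ["".join("#" if row & (0x80 >> col) else "."
                    for col in range(8))
            for row in glyph]

for line in render(GLYPH_A):
    print(line)
```

Eight bytes per glyph and a handful of bit tests per character, versus parsing outline data, scaling, hinting, and filtering on a modern stack.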

Or the must-insert-network-request-everywhere BS. No, I don't need the search engine to start searching & provide suggestions after I've typed 1 character & haven't hit "search" yet.
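The usual fix is trivial: debounce, i.e. only fire a suggestion request after a minimum query length and a pause in typing. A rough Python sketch of the idea (the 3-character / 0.3 s thresholds are arbitrary assumptions):

```python
def suggest_requests(keystrokes, min_chars=3, pause=0.3):
    """Given (timestamp, text_so_far) keystrokes, return which queries
    would actually hit the network: only those with at least min_chars
    typed AND no further keystroke within `pause` seconds (debounce)."""
    fired = []
    for i, (t, text) in enumerate(keystrokes):
        if len(text) < min_chars:
            continue
        next_t = keystrokes[i + 1][0] if i + 1 < len(keystrokes) else float("inf")
        if next_t - t >= pause:
            fired.append(text)
    return fired

# Typing "latency" quickly, then pausing at the end:
keys = [(0.0, "l"), (0.1, "la"), (0.2, "lat"), (0.3, "late"),
        (0.4, "laten"), (0.5, "latenc"), (0.6, "latency")]
print(suggest_requests(keys))  # only the final, complete query fires
```

One request instead of seven, and nothing at all for a single character.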

There are many examples like the above; I won't elaborate.

Some of that complexity is necessary. Some of it isn't necessary, but is lightweight & very useful. But much of it is just a pile of unneeded crap of dubious usefulness (if any).

Imho, software development really should return to first principles. Start with a minimum viable product that only has the absolutely necessary functionality relevant to end users. Don't even bother to include anything other than that minimum. Optimise the heck out of it, and presto: v1.0 is done. Go from there.

  • > But it also makes rendering each single character a lot more complex.

    Not millions of times more complex.

    Except for some outliers that mess up everything (like anything from Microsoft), almost all of the increased latency between keypress and character rendering we see on modern computers comes from optimizing for modularity and generalization instead of specialized code for handling the keyboard.

    Not even our hardware reacts fast enough to give you the latency computers had in the 90s.

• I'm not really sure what you are talking about ;) HW has become much, much, much faster. Mostly in computing speed, but latency also dropped nicely. The latency bloat you see is 99% software (OS). I still run Win2003 on a modern desktop, and it flies! Really, booting/shutdown is quick. I'm on spinning rust, so the first start of a web browser is a bit slow, but once cached, it's like 200ms-500ms depending on version (more modern = slower).
• I'm not sure font rendering is a very good example here. Windows has used vector fonts since the 90s and ClearType since Windows XP. That is nearly 25 years ago. And it wasn't really much of a performance issue even back then.

    • Correct. Modern font rendering likely falls into that "more complex, but lightweight / useful" category.

My point was that it's much more complex even though it does essentially the same thing (output a character to the screen).

      >> There are many examples like the above

Death by a thousand cuts! There are probably much worse offenders out there that deserve the attention.