Comment by inetknght

7 hours ago

> I look at memory profiles of normal apps and often think "what is burning that memory".

As a corollary to this: I look at CPU utilization graphs of programs that are supposedly completely idle. "What is burning all that CPU?!"

I remember using a computer with RAM measured in two-digit amounts of MiB. CPU measured in low hundreds of MHz. It felt just as fast -- sometimes faster -- as modern computers. Where is all of that extra RAM being used?! Where is all of that extra performance going?! There's no need for it!

Next time you see someone on HN blithely post "CPU / RAM is cheaper than developer time", it's them. Those are the coders who are collectively wasting our CPU and RAM.

  • If you ran a business, would you rather your devs work on feature X that could bring in Y revenue, or spend that same time reducing CPU/RAM/storage utilization by Z%, for a benefit of ???

    • There is probably some low hanging fruit to be harvested in terms of memory optimizations, and it could be a selling point for the next while as the memory shortage persists

  • Even an editor running under Inferno, plus Inferno itself, would be far lighter than current editors. And that's with the VM overhead included. And Limbo is a somewhat high-level language...

> I remember using a computer with RAM measured in two-digit amounts of MiB

Yes, so do I. It was limited to 800x600x16 color mode or 320x200x256. A significant amount of memory gets consumed by graphical assets, especially in web browsers which tend to keep uncompressed copies of images around so they can blit them into position.
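The memory cost of those decoded copies is easy to underestimate. A rough sketch (the 4-bytes-per-pixel RGBA figure is a common convention, not something specific to any one browser):

```python
# Back-of-the-envelope: memory for a decoded (uncompressed) RGBA image.
# A JPEG/PNG is small on disk, but once decoded so it can be blitted,
# each pixel typically takes 4 bytes (R, G, B, A channels).

def decoded_size_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    """Size in bytes of the raw pixel buffer for one decoded image."""
    return width * height * bytes_per_pixel

# One 4K image decoded in memory:
size = decoded_size_bytes(3840, 2160)
print(f"{size / (1024 * 1024):.1f} MiB")  # ~31.6 MiB for a single image
```

A page with a dozen large photos can thus hold hundreds of MiB in pixel buffers alone, which is why browsers from the two-digit-MiB era simply could not have kept everything decoded.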

But a lot is wasted, often by routing things through single bottlenecks in the whole system. Antivirus programs. Global locks. Syncing to the filesystem at the wrong granularity. And so on.

Work expands to fill the available time. This applies to CPU time just as it does to project management.

I too wonder that. And it is true on an OS level as well. The only worthwhile change in desktop environments since the early 2000s has been search-as-you-type launchers. Other than that, I would happily use something equivalent to Windows XP or (more likely) Linux with KDE 3. Everything else since then seems to have been mostly bloat and stylistic design changes, the latter being, in my opinion, a waste of time.

Of course, some software other than desktop environments has seen important innovation, such as LSPs in IDEs, which spare each IDE from having to implement support for every language itself. SSDs were truly revolutionary in hardware, in making computers feel faster. Modern GPUs can push far more advanced graphics in games as well. And so on. My point above was just about your basic desktop environment. Unless you use a tiling window manager (which I tried but never liked), nothing much has happened there for a very long time. So just leave it alone, please.

  • >The only worthwhile change in desktop environments since the early 2000s has been search as you type launchers.

    Add to that: Unicode handling, support for bigger displays, mixed-DPI setups, networking and device discovery being much less of a faff, better sound mixing, much-improved power management and sleep modes. And some other things I'm forgetting.