Comment by kittbuilds

6 days ago

There's something to this. The 200-400MHz era was roughly where hardware capability and software ambition were in balance — the OS did what you asked, no more.

What killed that balance wasn't raw speed, it was cheap RAM. Once you could throw gigabytes at a problem, the incentive to write tight code disappeared. Electron exists because memory is effectively free. An alternate timeline where CPUs got efficient but RAM stayed expensive would be fascinating — you'd probably see something like Plan 9's philosophy win out, with tiny focused processes communicating over clean interfaces instead of monolithic apps loading entire browser engines to show a chat window.

The irony is that embedded and mobile development partially lives in that world. The best iOS and Android apps feel exactly like your description — refined, responsive, deliberate. The constraint forces good design.

> What killed that balance wasn't raw speed, it was cheap RAM. Once you could throw gigabytes at a problem, the incentive to write tight code disappeared. Electron exists because memory is effectively free.

I dunno if it was cheap RAM or just developer convenience. In one of my recent comments on HN (https://news.ycombinator.com/item?id=46986999) I pointed out the performance difference on my 2001 desktop between an `ls` program written in Java at the time and the one that came with the distro.
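
To make that concrete, here is roughly the kind of thing being compared: a bare-bones Java directory lister. This is an illustrative sketch, not the actual program from that comment, and the class name `JLs` is made up:

```java
import java.io.File;
import java.util.Arrays;

// Illustrative sketch of a minimal Java "ls": print the entries of a
// directory in sorted order, nothing more.
public class JLs {
    public static void main(String[] args) {
        // List the named directory, or the current one by default.
        File dir = new File(args.length > 0 ? args[0] : ".");
        String[] names = dir.list();
        if (names == null) {
            System.err.println("jls: cannot list " + dir.getPath());
            System.exit(1);
        }
        Arrays.sort(names);
        for (int i = 0; i < names.length; i++) {
            System.out.println(names[i]);
        }
    }
}
```

Even with logic this trivial, every invocation pays the JVM startup and class-loading cost before it ever touches the filesystem, which is exactly where a short-lived tool like this loses badly to the native `ls` that ships with the distro.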

Had processor speeds not increased at that time, Java would have been relegated to history, along with a lot of other languages that became mainstream and popular (Ruby, C#, Python)[1]. There was simply no way companies would have kept spending 6-8 times more on hardware for a specific workload.

C++ would have been the enterprise language of choice (a new sort of hell!), and languages like Go (native code with a GC) would have been created sooner.

Between 1998 and 2005, computer speeds were increasing so fast that there was no incentive to develop new languages. All you had to do was wait a few months for a program to run faster!

What we did was trade efficiency for developer velocity, and it was a good trade at the time. Since around 2010, performance increases have been shrinking, and faced with stagnant hardware performance, new languages were created to address that (Rust, Zig, Go, Nim, etc.).

-------------------------------

[1] It took two decades of constant work for those high-dev-velocity languages to reach some sort of acceptable performance. Some of them are still orders of magnitude slower.

  • > Had processor speeds not increased at that time, Java would have been relegated to history, along with a lot of other languages that became mainstream and popular (Ruby, C#, Python)[1].

    I'd go look at the start date for all these languages. Except for C#, which was a direct response to the Sun lawsuit, all these languages spawned in the early 90s.

    Had processor speed and memory advanced more slowly, I don't think you'd have seen these languages go away; they'd just have ended up being used for different things or in different ways.

    JavaOS, in particular, probably would have had more success. An entire OS written in, and for, a language with a garbage collector to make sure memory isn't wasted would have been much more appealing.

    • > I'd go look at the start date for all these languages. Except for C#, which was a direct response to the Sun lawsuit, all these languages spawned in the early 90s.

      I don't understand your point here: I did not say those languages came only after 2000; I said they would have been relegated to history if they hadn't become usable thanks to hardware increases.

      Remember that Java was not designed as an enterprise/server language. Sun pivoted when it failed at its original task (set-top boxes), and it was only able to pivot because of hardware performance increases.


  • As you say, the trade-off is developer productivity vs resources.

    If resources are limited, that changes the calculus. But it can still make sense to spend a lot on hardware instead of development.

Lots of good practices! I remember how aggressively iPhoneOS would kill your application when you got close to running out of physical memory, or how you had to quickly serialize state when the user switched apps (no background execution, after all!). And, for better or for worse, it was native code, because you couldn’t and still can’t get a “good enough” JITing language.