Comment by vincent-manis
12 hours ago
I don't do web stuff at all, but I really enjoyed this article. I am convinced that software engineers (not to mention others) have thrown the baby out with the bathwater in our brave new world of 32GB memories and fibre-optics. By all means the generous hardware capabilities let us do amazing things, like have a video library, or run massive climate computations, but mostly those resources are piddled away in giant libraries that provide little or no actual functional value.
I don't really pine for the days of the PDP-8, when programmers had to make sure that almost every routine took fewer than 128 words, or the days of System/360, when you had to decide whether the fastest way to clear a register was to subtract it from itself or exclusive-or it with itself. We wasted a lot of time trying to get around stringent limitations of the technology just to do anything at all.
I just looked at the Activity Monitor on my MacBook. Emacs is using 115MB, Thunderbird is at 900MB, Chrome is at something like 2GB (I lost track of all the Renderer processes), and a Freecell game is using 164MB. Freecell, which ran just fine on Windows 95 in 8MB!
I'm quite happy with a video game taking a few gigabytes of memory, with all the art and sound assets it wants to keep loaded. But I really wonder whether we've lost something by not making more of an effort to use resources more frugally.
An addendum: Back in the 1960s, IBM didn't grok time-sharing. When MIT/Bell Labs looked for a machine with address translation, IBM wasn't interested, so GE got the contract. IBM suddenly realized that they had lost an opportunity, and developed their own address translation, which ended up in the IBM 360/67. They also announced an operating system, TSS/360, for this machine.

IBM practice was to define memory constraints for their software. So Assembler F would run on a 64K machine, Fortran G on a 128K machine, and so on. The TSS engineers asked how much memory their components were given. They were told "It's virtual memory, use as much as you need." When the first beta of TSS/360 appeared, an attempt to log in produced the message LOGON IN PROGRESS...for 20 minutes.

Eventually, IBM made TSS/360 usable, but by then it was too late. 360/67s ended up running VM/CMS, or 3rd-party systems: I had many happy years using the Michigan Terminal System.
Remember, there's a gigabit pathway between server and browser, so use as much of the bandwidth as you need.
On my deathbed, I'm not sure I'll be able to forgive our industry for that. I grew up in the third world, where resources were extremely expensive, so my early career was all about doing the most with the resources I had. It was a skill I had honed so well, and now it feels useless and unappreciated. With higher interest rates we see a small degree of it again, but I'm doubtful that hiring managers without that experience will be able to identify it in the wild and pick me out.
> I really wonder whether we've lost something by not making more of an effort to use resources more frugally
I'll bite. What do you think we've lost? What would the benefit be of using resources more frugally?
Disclosure: I'm an embedded systems programmer. I frequently find myself in the position where I have to be very careful with my usage of CPU cycles and memory resources. I still think we'd all be better off with infinitely fast, infinitely resourced computers. IMO, austerity offers no benefit.
Quick answer: is our software any more usable, any more reliable, than it was 50 years ago? The more code you write, and the more dependencies you require, the more opportunity for bugs and design errors to creep in. I get the impression that many software projects nowadays have enough fixes and kludges slathered on just to keep them working.
(Remember Bill Atkinson's famous response, quoted here, to the question of how much code he'd written that week: -3000. He had reworked QuickDraw so that it was faster and better, with a net loss of 3000 lines of code.) Of course the classic Mac had its own constraints.
> is our software any more usable ... than it was 50 years ago?
Yes, by several orders of magnitude. I couldn't enter or display Japanese on my Atari 800, nor my Apple 2, nor my C64 (sorry, only 45 years ago). I couldn't display 200+ large 24-bit images with ease (here's 100: https://www.flickr.com/groups/worldofarchitecture/pool/). Or try this: https://scrolldit.com/
I couldn't play 16 videos simultaneously while downloading stuff in the background and playing a game. I could go on and on, but my computer today is vastly more usable than any of my computers 40 years ago, which could only effectively run one app at a time, and on which I had to run QEMM and edit my config.sys and autoexec.bat to try to optimize my EMS and XMS memory.
I love that I can display a video with a single line of HTML.
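For instance, the browser's built-in HTML5 video element handles it natively, no libraries required (a minimal sketch; the filename is a placeholder):

```html
<!-- Native playback with browser-provided controls; clip.mp4 is a placeholder -->
<video src="clip.mp4" controls></video>
```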
It's much more capable, that's the main thing. Reliability and usability tend not to be valued in the market much, but being able to do more things is.
I'm a scientist, working adjacent to a team of engineers.
My sense is that there's a "conceptual" austerity, due to the limited ability of the human mind to understand complex systems. The programmers are satisfied that the product is documented, because it passes tests, and "the source code expresses what the program does." But nobody can explain it to me, in situations where I have to be involved because something has gone wrong.
The system has surpassed some threshold of conceptual austerity when the majority of the devs have concluded that the only hope is to scrap it and start over, but they can't, because they don't know what it's supposed to do, and can't find out.
On the other hand, the infinite computer would take care of this for us too. We're faced with semi-infinite computers at the present time, that can be filled to the brim with stuff that they can't themselves understand or manage. But all real things are finite.
> austerity offers no benefit
Data centers use a lot of electricity. Even a 10% reduction would have a huge impact worldwide.
> I still think we'd all be better off with infinitely fast, infinitely resourced computers. IMO, austerity offers no benefit.
Well, sure. But these computers don't exist, so that doesn't really matter.
The main reason I bought a new laptop last year is that the frontend build process needed about 5GB of RAM and ~4 minutes, which was a real productivity killer. I'm not a frontend dev, I inherited all of this, and it wasn't easy to fix, for both technical and organisational reasons.
Excessive austerity offers no benefit, I agree, but some projects are out of control in their resource usage, and impose a real burden with it.
Partial answer: Understanding of and peace with ourselves as humans. Human skill and discipline are long-lasting challenges that satisfy. Those who have not experienced the process of improving the self over years of practice are prone to unease and depression.
> But I really wonder whether we've lost something by not making more of an effort to use resources more frugally.
On the desktop we definitely lost responsiveness. Many webpages, even on the speediest, fastest computer of them all, are dog slow compared to what they should be.
Now some pages are fine, but the number of pigs out there is just plain insane.
I like my desktop to be lean and ultra low latency: I'll have tens of windows (including several browsers) and it's super snappy. Switching from one virtual workspace to another is too quick to see what's happening (it takes milliseconds, and I do it with a keyboard shortcut: reaching for the mouse is a waste of time).
I know what it means to have a system that feels like it's responding instantly as I do have such a system... Except when I browse the web!
And it's really only some (well, a lot of) sites: people who know what they're doing still come up with amazingly fast websites. But it's the turds: those shipping every package under the sun and calling thousands of micro-services, wasting all the memory available because they know jack shit about computer science, that make the Web a painful experience.
And although I use LLMs daily, I see a big overlap between those having the mindset required to produce such turds and those thinking LLMs are already perfectly fine today to replace devs, so I'm not exactly thrilled about the immediate future of the web.
P.S.: before playing apologist for such turd-sites, remember you're commenting on a very lean website. So there's that.