Comment by HeyLaughingBoy

20 hours ago

> I really wonder whether we've lost something by not making more of an effort to use resources more frugally

I'll bite. What do you think we've lost? What would the benefit be of using resources more frugally?

Disclosure: I'm an embedded systems programmer. I frequently find myself in the position where I have to be very careful with my usage of CPU cycles and memory resources. I still think we'd all be better off with infinitely fast, infinitely resourced computers. IMO, austerity offers no benefit.

In my embedded career I always assumed the value of a SW engineer was to cut costs, even when the savings were only a few cents per unit (and, of course, to add quality and reliability). Embedded seems to be one of the few SW fields left where software can affect the cost of the system: by choosing the processor in a family with the least memory and the lowest speed, you can achieve a lower system cost. In this world, the ability to write fast code in a minimal footprint matters. You still have to weigh development time/cost against HW cost and sales volume, but I think the coding practices we saw in the 60s-80s are still valued in this space. My impression is that these skills are mostly being lost or ignored these days. When a HW limit is reached, the only software-only way forward is optimization.

Quick answer: is our software any more usable, any more reliable, than it was 50 years ago? The more code you write and the more dependencies you pull in, the more opportunities there are for bugs and design errors to creep in. My impression is that many software projects nowadays have enough fixes and kludges slathered on them just to keep them working.

(Remember Bill Atkinson's famous response when asked how much code he'd written that week: -2000. He had reworked QuickDraw so that it was faster and better, with a net loss of 2000 lines of code.) Of course, the classic Mac had its own constraints.

  • > is our software any more usable ... than it was 50 years ago?

    Yes, by several orders of magnitude. I couldn't enter or display Japanese on my Atari 800, Apple II, or C64 (sorry, only 45 years ago). I couldn't display 200+ large 24-bit images with ease (here's 100: https://www.flickr.com/groups/worldofarchitecture/pool/). Or try this: https://scrolldit.com/

    I couldn't play 16 videos simultaneously while downloading stuff in the background and playing a game. I could go on and on, but my computer today is vastly more usable than any of my computers 40 years ago, which could effectively run only one app at a time and made me run QEMM and edit my config.sys and autoexec.bat to try to optimize my EMS and XMS memory.

    I love that I can display a video as simply as

        <video src="url-to-video"></video>

  • It's much more capable; that's the main thing. Reliability and usability tend not to be valued much in the market, but being able to do more things is.

    • I’m not sure that is true; if anything, I would say most of the software I use today has been dumbed down to cater to a mainstream audience.

I'm a scientist, working adjacent to a team of engineers.

My sense is that there's a "conceptual" austerity, due to the limited ability of the human mind to understand complex systems. The programmers are satisfied that the product is documented, because it passes tests, and "the source code expresses what the program does." But nobody can explain it to me, in situations where I have to be involved because something has gone wrong.

The system has surpassed some threshold of conceptual austerity when the majority of the devs have concluded that the only hope is to scrap it and start over, but they can't, because they don't know what it's supposed to do, and can't find out.

On the other hand, the infinite computer would take care of this for us too. We're faced with semi-infinite computers at the present time, that can be filled to the brim with stuff that they can't themselves understand or manage. But all real things are finite.

> I still think we'd all be better off with infinitely fast, infinitely resourced computers. IMO, austerity offers no benefit.

Well, sure. But these computers don't exist, so that doesn't really matter.

The main reason I bought a new laptop last year is that the frontend build process needed about 5 GB of RAM and ~4 minutes, which was a real productivity killer. I'm not a frontend dev, I inherited all of this, and it wasn't easy to fix, for both technical and organisational reasons.

Excessive austerity offers no benefit, I agree, but some projects are out of control in their resource usage, and impose a real burden with it.

> austerity offers no benefit

Data centers use a lot of electricity. Even a 10% reduction would have a huge impact worldwide.

Partial answer: Understanding of and peace with ourselves as humans. Human skill and discipline are long-lasting challenges that satisfy. Those who have not experienced the process of improving the self over years of practice are prone to unease and depression.