Comment by ranger207

2 years ago

> The only software that can go without updates is software that gets it right the first time

If you're building software for yourself, this is relatively easy. Your tastes probably won't change that much, even after a decade. You can probably ignore minor problems like using an O(n^2) function where an O(n) one exists, because your n is small. If you're writing software that other people will use, that's where the problems come in. Other people don't have the same requirements as you, and may, for example, have an n large enough that the O(n) function is worth it.
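
To make the complexity point concrete, here is a minimal sketch (my own hypothetical Python example, not from the original comment) of two functions that do the same job but scale very differently:

    # Hypothetical illustration: both check for duplicates, but they scale differently.

    def has_duplicates_quadratic(items):
        # O(n^2): perfectly fine when n is a few hundred items
        for i, a in enumerate(items):
            for b in items[i + 1:]:
                if a == b:
                    return True
        return False

    def has_duplicates_linear(items):
        # O(n): the version a user with a much larger n would want
        seen = set()
        for item in items:
            if item in seen:
                return True
            seen.add(item)
        return False

For the author's own use the quadratic version may never matter; for another user processing millions of items, it's a real defect.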

But regardless of whether you're writing for yourself or someone else, sometimes you just can't foresee problems. Maybe it crashes processing files larger than a gig, but because you've only ever used files <100KB it's never mattered to you. Then you go in to fix the crash and it turns out you're going to have to rewrite half the thing.

This is, I think, the biggest argument against the idea that software that doesn't change is inherently better than software that changes frequently[0]: it may be that the unchanging software was perfect from the first line, or it may be that there are terrors lurking in the deep, and a priori it can be difficult to tell which a particular project is.

[0] This is not to say that rapidly updating software is inherently better than slowly updating software either. There are many factors other than just update speed.

I don't think the idea is that software should never change. If requirements change, then the software obviously has to change as well.

But over the course of 10 years a lot of things can change that have nothing to do with changing requirements.

Open source projects are abandoned or change direction. Commercial software gets discontinued. Companies get acquired. App Store / Play Store rules change. APIs go away or change pricing in ways that render projects economically unviable. Toolchains, frameworks, programming languages, paradigms, and best practices all change.

I think the point is that you don't want external changes that are unrelated to your requirements to force change on you. It's a good principle, but as always there are trade-offs.

There is stable and then there is obsolete. The difference is often security.

And what if an important new requirement is easy to meet, but only if you bump a vendored library by seven major versions, causing all sorts of unrelated breakage?

What if there aren't enough people left who are familiar with your frozen-in-time toolset and nobody wants to learn it any longer?

I think careful and even conservative selection of dependencies is a good idea, but not keeping up with changes in that hopefully small set of dependencies is one step too far for me.