Comment by mustache_kimono

1 day ago

> .. [I]t is not at all about performance, but performance is the canary in the coalmine: it is a direct translation of the essential vs accidental complexity problem.

This is all nice color on my commentary, but it fails to address the point of my two parent comments: programming is an economic activity. Sometimes a putatively more complex solution is the "right" solution for someone else, because it is easier to understand and implement, or fits within an existing workflow (it is more coherent and consistent).

Yes, if the performance delta is an order of magnitude, then perhaps that is a problem for such software, but then again, maybe it isn't, because economics matter. Lots of people use 10x+ slower languages for loads of technical reasons, but also economic ones.

> In other words: performance is an easy objective metric for the complexity that lies behind (an otherwise opaque) piece of software.

Then presumably so is performance per dollar? Your argument can make sense where the cost of a redesign is low (in programmer education, experience, and ultimately work) and the performance benefits are high (10ms faster nets us 10x more dollars). That is -- Blow et al./you need to show us where these "easy", if you will, 10x gains are.

Again -- I agree performance problems are real problems, and data-oriented design is one way to reason about those problems, but Blow's marketing exercise/catastrophizing (see "Preventing the Collapse of Civilization") hasn't solved any problems, and it is barely an argument without an analysis of what such incremental improvements cost.

> This is all nice color on my commentary, but it fails to address the point of my two parent comments: programming is an economic activity

I've mentioned the economics multiple times now, while you're still hung up on performance; I'm not sure why. Again, performance is an indicator of a perceived deeper underlying problem. The underlying problem is not performance, though that's the surface-level gripe that gets mentioned. No part of the argument advocates that you should redesign a specific piece of software to be faster. Rather, the argument is that our collective ability to make good software is deteriorating.

The underlying problem is nebulous and hard to catch and prove, because it is hard to reason objectively about a real program in relation to hypothetical other programs that could compete with it. This makes the Muratori/Blow argument similarly nebulous, and their (intentional or not) judgmental attitude does not help the communication. I am aware that this argument is not ironclad or even clear, nor that the judgmental attitude is in any way warranted.

So, why does it even make sense to talk about this then? Because if there is an alternate universe where we can actually solve the same problems with vastly simpler logical structures, we should strive to make that reality precisely because of the economics, because simpler logical structures beat the pants off complexity in terms of predictability, investment, ROI, etc.

So to summarize, this is the argument (as I perceive it):

1. Lots of software is slowing down over time, i.e., the same problems are solved with more resources.

2. More resources means not just waiting longer for stuff to be done, but likely also more complexity (resources are spent doing something, hence there is more to be done, hence more complexity).

3. If the same problems are solved by involving increasingly more complex software over time, there is a likelihood that we are writing software (even new software) in a more complex way than necessary, and that it's getting worse over time.

4. We should figure out whether that observation is true, and what we can do about it, before the cost of building software (economics) becomes prohibitive (dramatized as "the collapse of civilization").

A lot of assumptions are made in 1 and 2.