Comment by WJW
11 hours ago
This essay, like so many others, mistakes the task of "building" software for the task of "writing" software. Anyone in the world can already get cheap, mass-produced software to do almost anything they want their computer to do. Compilers spit out a new build of any program on demand within seconds, and you can usually get both source code and pre-compiled copies over the internet. The "industrial process" (as TFA puts it) of production and distribution is already handled perfectly well by CI/CD systems and CDNs.
What software developers actually do is closer to the role of an architect in construction or a design engineer in manufacturing. They design new blueprints for the compilers to churn out. Like any design job, this needs some actual taste and insight into the particular circumstances. That has always been the difficult part of commercial software production and LLMs generally don't help with that.
It's like thinking the greatest barrier to producing the next great Russian literary novel is not speaking Russian. That is merely the first and easiest barrier, but after learning the language you are still no Tolstoy.
You're getting caught up on the technical meaning of terms rather than what the author actually wrote.
They're explicitly saying that most software will no longer be artisanal - a great literary novel - and will instead become industrialized - mass-produced paperback garbage. But they're also saying that good software, like good literature, will continue to exist.
Yes, I read the article. I still think it's incorrect. Most software (especially by usage) is already not artisanal. You get the exact same browser, database server and (whatsapp/signal/telegram/whatever) messenger client as basically everyone else. Those are churned out by the millions from a common blueprint and designed by teams and teams of highly skilled specialists using specialized tooling, not so different from the latest iPhone or car.
As such, the article's point fails right at the start, when it tries to argue that software production is not already industrial. It is. But if you look at actual industrial design processes, their equivalent of "writing the code" is a relatively small part. Quality assurance, compliance with various legal requirements, balancing competing requirements for the product at hand, endless meetings with customer representatives to figure out the requirements in the first place: that is where most of the time goes, and those are exactly the places where LLMs are not very good. So the part that is already fast will get faster and the slow part will stay slow. That is not a recipe for revolutionary progress.
Hey! I'm going to passionately defend my choice over a really minor difference. I mean do you see how that app does their hamburger menu?! It makes the app utterly unusable!
Maybe I'm exaggerating here, but I've heard things pretty close in "Chrome vs Firefox" and "Signal vs ..." threads. People are really passionate about tiny details. Or at least they think that's what they're passionate about.
Unfortunately, I think what they don't realize is that this passion often hinders the revolutionary progress you speak of. It just creates entrenched players and monopolies in domains where it should be near trivial to move (jumping ship between browsers certainly is).
I think the author of the post envisions more code authoring automation, more generated code/test/deployment, exponentially more. To the degree what we have now would be "quaint", as he says.
The sameness you point to - most software using the same browsers, databases, tooling and internal libraries - is a weakness that can be exploited by current AI to push that automation capability much further. Hell, why even bother with any of the generated code and infrastructure being "human readable" anymore? (Of course, there are all kinds of reasons that is bad, but just watch that "innovation" get a marketing push and take off. Which would only mean we'd need viewing software to make whatever was generated readable - as if anyone would read to understand hundreds/millions of lines of generated complex anything.)
I guess two things can be true at the same time. And I think AI will likely matter a lot more than detractors think, and nowhere near as much as enthusiasts think.
Perhaps a good analogy is the spreadsheet. It was a complete shift in the way that humans interacted with numbers. From accounting to engineering to home budgets - there are few people who haven't used a spreadsheet to "program" the computer at some point.
It's a fantastic tool, but has limits. It's also fair to say people use (abuse) spreadsheets far beyond those limits. It's a fantastic tool for accounting, but real accounting systems exist for a reason.
Similarly, AI will allow lots more people to "program" their computer. But making the programming task go away just exposes limitations in other parts of the "development" process.
To your analogy I don't think AI does mass-produced paperbacks. I think it is the equivalent of writing a novel for yourself. People don't sell spreadsheets, they use them. AI will allow people to write programs for themselves, just like digital cameras turned us all into photographers. But when we need it "done right" we'll still turn to people with honed skills.
> To your analogy I don't think AI does mass-produced paperbacks
It's the article's analogy, not mine.
And, are you really saying that people aren't regularly mass-vibing terrible software that others use...? That seems to be a primary use case...
Though, yes, I'm sure it'll become more common for many people to vibe their own software - even if just tiny, temporary, fit-for-purpose things.
What he's missing is that there's always been a market for custom-built software by non-professionals. For instance, spreadsheets. Back in the 1970s engineers and accountants and people like that wrote simple programs for programmable calculators. Today it's Python.
The most radical development in software tools, I think, would be more tools for non-professional programmers to build small tools that put their skills on wheels. I did a lot of biz dev around something that encompassed "low code/no code", but a revolution there involves smoothing out 5-10 obstacles with a definite Ashby character: if you fool yourself that you can get away with ignoring the last 2 requirements, you get just another Wix that people will laugh at. For now, AI coding doesn't have that much to offer the non-professional programmer, because a person without insight into the structure of programs, project management, and a sense of what quality means will go in circles at best.
I think the article's thinking about the economics is completely backwards. I mean, the point of software is that you write it once and the cost to deploy a billion units is trivial in comparison. Sure, AI slop can put the "crap" in "app", but if you have any sense you don't go cruising the app store for trash; you find out about best-of-breed products, or products that are the thin edge of a long wedge (like the McDonald's app, which is valuable because it has all the stores backing it).
This was already true before LLMs. "Artisanal software" was never the norm. The tsunami of crap just got a bit bigger.
Unlike clothing, software always scaled. So, it's a bit wrongheaded to assume that the new economics would be more like the economics of clothing after mass production. An "artisanal" dress still only fits one person. "Artisanal" software has always served anywhere between zero people and millions.
LLMs are not the spinning jenny. They are not an industrial revolution, even if the stock market valuations assume that they are.
Agreed, software was always kind of mediocre. This is expected given the massive first mover advantage effect. Quality is irrelevant when speed to market is everything.
> It's like thinking the greatest barrier to producing the next great Russian literary novel is not speaking Russian.
The article is very clearly not saying anything like that. It's saying the greatest barrier to making throwaway comments on Russian social media is not speaking Russian.
Roughly the entire article is about LLMs making it much cheaper to make low quality software. It's not about masterpieces.
And I think it's generally true of all forms of generative AI, what these things excel at the most is producing things that weren't valuable enough to produce before. Throwaway scripts for some task you'd just have done manually before is a really positive example that probably many here are familiar with.
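The sort of throwaway script meant here might look something like this (a minimal sketch; the CSV data and the "amount" column are invented for illustration - the point is a one-off task you would previously have done by hand):

```python
# One-off script: total the "amount" column of a small CSV export.
# Data is inlined here as a stand-in for a file you'd normally eyeball.
import csv
import io

data = """date,amount
2024-01-02,19.99
2024-01-05,5.00
2024-01-09,12.50
"""

# Parse the CSV text and sum the amount column.
total = sum(float(row["amount"]) for row in csv.DictReader(io.StringIO(data)))
print(f"total: {total:.2f}")
```

Nothing here is hard, but before LLMs the few minutes of friction often meant the script never got written and the task got done manually instead.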
But making stuff that wasn't worth making before isn't necessarily good! In some cases it is, but it really sucks if we have garbage blog posts and readmes and PRs flooding our communication channels because it's suddenly cheaper to produce than whatever minimal value someone gets out of hoisting it on us.
I also wonder about the process.
I've watched a lot of people involved in the process happily request that their software be turned into spaghetti. Often because some business process "can't" be changed, but mostly because decision makers don't know or understand what they're asking for in the larger scheme of things.
A good engineer can help mitigate that, but only so much. So you end up with industrial sludge to some extent anyway if the people in the process are not thoughtful.
> It's like thinking the greatest barrier to producing the next great Russian literary novel is not speaking Russian. That is merely the first and easiest barrier, but after learning the language you are still no Tolstoy.
And what do you feel is the role of universities? Certainly not just to teach the language, right? I'm going through a computer engineering degree and sometimes I feel completely lost, with an urge to give up on everything, even though I'm still interested in technology.
One can go to school to learn the literary arts. Many do. A lot of authors do not.
A lot of engineers and programmers did not go to school.
I have for a long time been saying software is a new form of literacy - and I really need to finish writing the book !