Comment by toprerules
16 days ago
After working with the latest models, I think these "it's just another tool," "another layer of abstraction," or "I'm just building at a different level" kinds of arguments are wishful thinking. You're not going to be a designer writing blueprints for a series of workers to execute on; you're barely going to be a product manager translating business requirements into a technical specification before AI closes that gap as well. I'm very convinced non-technical people will be able to use these tools, because what I'm seeing is that all of the skills that my training and years of experience have helped me hone are now implemented by these tools to the level that I know most businesses would be satisfied by.
The irony is that I haven't seen AI have nearly as large of an impact anywhere else. We truly have automated ourselves out of work; people are just catching up with that fact, and the people who just wanted to make money from software can now finally stop pretending that "passion" for "the craft" was ever really part of their motivating calculus.
If all you (not you specifically, more of a royal “you” or “we”) are is a collection of skills centered around putting code into an editor and opening pull requests as fast as possible, then sure, you might be cooked.
But if your job depends on taste, design, intuition, sociability, judgement, coaching, inspiring, explaining, or empathy in the context of using technology to solve human problems, you’ll be fine. The premium for these skills is going _way_ up.
The question isn't whether businesses will have zero human element to them; it's whether AI leaves a big enough gap that technical skills are still required, such that technical roles are still hired for. Someone in product can have all of those skills without a computer science degree and with no design experience, and AI will do the technical work at the level of design, implementation, and maintenance. What I am seeing with the new models isn't just writing code; it's taking fundamental problems as input and producing holistic software solutions as output, and the quality is there.
I am only seeing that if the person writing the prompts knows what a quality solution looks like at a technical level and is reviewing the output as they go. Otherwise you end up with an absolute mess that may work at least for "happy path" cases but completely breaks down as the product needs change. I've described a case of this in some detail in another comment.
It turns out that corporations value these things right up until a cheaper, almost-as-good alternative is available.
The writing is on the wall for all white collar work. Not this year or next, but it's coming.
If all white collar work goes, we’re going to have to completely restructure the economy or collapse completely.
Being a plumber won’t save you when half the work force is unemployed.
Ah the age old 'but humans have heart, and no machine can replicate that' argument. Good luck!
The process of delivering useful, working software for nontrivial problems cannot be reduced to simply emitting machine instructions as text.
When your title is software engineer, good luck convincing the layoff machine about your taste, design, intuition, sociability, judgement, coaching, inspiring, explaining, or empathy in the context of using technology to solve human problems.
> The irony is that I haven't seen AI have nearly as large of an impact anywhere else.
We are in this pickle because programmers are good at making tools that help programmers. Programming is the tip of the spear, as far as AI's impact goes, but there's more to come.
Why pay an expensive architect to design your new office building, when AI will do it for peanuts? Why pay an expensive lawyer to review your contract? Why pay a doctor, etc.
Short term, doing for lawyers, architects, civil engineers, doctors, etc what Claude Code has done for programmers is a winning business strategy. Long term, gaining expertise in any field of intellectual labor is setting yourself up to be replaced.
> Why pay an expensive architect to design your new office building, when AI will do it for peanuts? Why pay an expensive lawyer to review your contract? Why pay a doctor, etc.
All of those jobs are mandated by law to be done by accredited and liable humans.
> All of those jobs are mandated by law to be done by accredited and liable humans.
Good point. The jobs I listed will be protected for a little while due to statutory limitations. At first, firms will have one AI-augmented lawyer take on the work of a dozen lawyers. Of course his salary won't increase, and the others will be fired. Eventually, he'll just be rubber-stamping the AI's results, purely for the sake of compliance. Then the ruling class will petition the legislature to change the law in the name of "efficiency," and that will be the end of that.
Meanwhile, programmers have no such protection. Nor do customer service agents, secretaries, publishers, copywriters, bankers, or office managers. There is no safety net.
> Why pay an expensive architect to design your new office building, when AI will do it for peanuts?
Will it? AI is getting good at some parts of programming because of RLVR (reinforcement learning with verifiable rewards). You can test architectural designs automatically to some extent, but not entirely, because people tend to want unique buildings that stand out (if that weren't the case, architects would already have become a niche profession, with everyone using prefabs all the time). At some point an architectural design has to be built, and you can't currently simulate real building sites at high speed inside a datacenter. This use case feels marginal.
There's going to be a lot of cases like this. The safe jobs are ones where there's little training data available online, where the job has a large component of unarticulated experience or intuition, and where you can't verify purely in software whether the work artifact is correct or not.
> people tend to want unique buildings that stand out
Just tell the LLM that you want a unique design. I've found LLMs to respond well to requests for "originality," at least in poetry, prose, and coding. No reason they can't do that in architecture as well.
> At some point an architectural design has to be built and you can't currently simulate real building sites at high speed inside a datacenter.
First of all, you can simulate a building site, or any physical environment; we've been doing that for years, even in games. AI companies are working towards a "world model" for precisely that reason. Second, even without a physical simulation, the laws of physics are deterministic and easy for an LLM to understand.
> The safe jobs are ones where there's little training data available online,
These cases are "safe" only in relative terms. Lack of easily-available training data is friction but not insurmountable. AI companies have bet big and they have a strong incentive to find and use appropriate training data.
> what I'm seeing is that all of the skills that my training and years of experience have helped me hone are now implemented by these tools to the level that I know most businesses would be satisfied by.
So when things break or they have to make changes, and the AI gets lost down a rabbit hole, who is held accountable?
The answer is the AI. It's already handling complex issues and debugging solely by gathering its own context, doing major refactors successfully, and doing feature design work. The people who will be held responsible are the product owners, but it won't be for bugs; it will be for business impact.
My point is that SWEs are living on a prayer that AI will stay perched on a knife's edge where there is still some amount of technical work left to make our profession sustainable, and from what I'm seeing that's not going to be the case. It won't happen overnight, but I doubt my kids will ever even think about a computer science degree or doing what I did for work.
I work in the green energy industry and we see it a lot now. Two years ago the business would've had to either buy a bunch of bad "standard" systems which didn't really fit, or wait for their challenges to be prioritised enough for some of our programmers. Today 80-90% of the software produced in our organisation isn't even seen by our programmers. It's built by LLMs in the hands of various technically inclined employees who make it work. Sometimes some of it scales up enough that our programmers get involved, but for the most part, the quality matters very little. Sure, I could write software that does the same thing faster and with much less compute, but when the compute is $5 a year I'd have to write it rather fast to make up for the cost of my time.
I make it sound like I agree with you, and I do to an extent. Hell, I'd want my kids to be plumbers or similar, where a couple of years ago I would've wanted them to go to a university. With that said, I still haven't seen anything from AIs to convince me that you don't need computer science. To put it bluntly, you don't need software engineering to write software, until you do. A lot of the AI-produced software doesn't scale, and none of our agents have been remotely capable of producing quality, secure code, even in the hands of experienced programmers. We've not seen any real change there over the past two years either.
Of course this doesn't mean you're wrong either, because we're going to need a lot fewer programmers regardless. We need the people who know how computers work, but in my country that is a fraction of the total IT worker pool available. In many CS programs students aren't even taught how a CPU or memory functions. They are instead taught design patterns, OOP, and clean architecture, which are great when humans are maintaining code, but even small abstractions can cause L1-L3 cache misses. Which doesn't matter, until it does.
And what happens when the AI can't figure it out?
> After working with the latest models, I think these "it's just another tool," "another layer of abstraction," or "I'm just building at a different level" kinds of arguments are wishful thinking. You're not going to be a designer writing blueprints for a series of workers to execute on; you're barely going to be a product manager translating business requirements into a technical specification before AI closes that gap as well
I think it's doubtful you'll be even that; certainly not with the salary and status that normally entails.
> I'm very convinced non-technical people will be able to use these tools
This suggests that the skill ceiling of "Vibe Coding" is actually quite low, calling into question the sense of urgency with which certain AI influencers present it, as if it were a skill you need to invest major time and effort to hone now (with their help, of course), lest you get left behind and have to "catch up" later. Yet one could easily see it being akin to Googling, which was also a skill (when Google was usable), one that did indeed increase your efficiency and employability, but with a low ceiling, such that "Googler" was never a job by itself, the way some suggest "prompt engineer" will be. The Google analogy is apt in that you're typing keywords into a black box until it spits out what you want, quite akin to how people describe "prompt engineering."
Also, the Vibe Coding skillset, a bag of tricks and book of incantations you're told can cajole the model, has a high churn rate. Once, narrow context windows meant restarting a session de novo was advisable if you hit a roadblock; now it's usually the opposite.
If this is all true, then wouldn't the correct takeaway, rather than embracing and mastering "Vibe Coding" (as influencers suggest), be to "pivot" to a new career, like welding?
> The irony is that I haven't seen AI have nearly as large of an impact anywhere else. We truly have automated ourselves out of work, people are just catching up with that fact
What's funny is artists immediately, correctly perceived the threat of AI. You didn't see cope about it being "just another tool, like Photoshop."
Gen AI for art was different because it would just output a final image with basically zero control for the artist. It's as if AI programming output a binary instead of source code.
> translating business requirements into a technical specification
a.k.a. Being a programmer.
> The irony is that I haven't seen AI have nearly as large of an impact anywhere else.
What lol. Translation? Graphic design?
Writing? Education?