Comment by the_af
1 day ago
Arguing about programming is not the point, in my opinion.
When AI becomes able to do most non-programming tasks too -- say, design, or solving open-ended problems (which, except in trivial cases, it cannot do -- for now) -- we can have this conversation again...
I think saying "well, programming is not important, what matters is $THING" is a coping mechanism. Eventually AI will do $THING well enough for the bean counters to push for more layoffs.
When AI can do the software engineering tasks that require expertise outside of coding -- system design, scoping problems, cross-team/domain work, etc. -- then it will be AGI, at which point the fact that SWE jobs are automated would be the least of everyone's worries.
The main problem I perceive with AI being able to do that kind of work is that it requires an unprecedented level of agency and context-gathering. Right now agents are very much like juniors in that they work in an insular, not collaborative, way.
Another big problem is that these higher level problems often require piecing together a lot of fragmented context. If the AI already had access to the information, sure, it would probably be able to achieve the task. But the hard bit is finding the information. Some logs here, some code there, a conversation with someone on a different team, etc. It's often a highly intuitive and tacit process, not easily explicitly defined. There's a reason that defining what a "Senior" is tends to be very difficult.
> When AI can do the software engineering tasks that require expertise outside of coding like system design, scoping problems, cross-team/domain work, etc then it will be AGI
I think you're talking about the really general case, but in my opinion that's not as important. AI solutions only need (in the near future) to cover the average case -- where most engineers actually work -- in a mediocre but cost-effective manner for this to have huge repercussions on the job market and salaries.
> But the hard bit is finding the information. Some logs here, some code there, a conversation with someone on a different team, etc.
I've no problem believing they will become more and more successful at this. This is information retrieval, which machines can do faster; making sense of it all together is where advances in AI will need to happen. I think there's a high chance those advances will happen eventually, at least to a degree that's enough to cobble together projects that will make the leadership happy (maybe after some review/adjustment by the few human experts they retain?). They do not even have to be particularly successful -- how many human-staffed engineering projects succeed, anyway?
Also, because the economy is no longer based on competition but is controlled by a handful of industry-specific oligopolies, even if the bean counters are wrong it won't matter, because every other company will be similarly inefficient. Everybody loses, but the people in charge are too dumb to know it. Our free market is currently broken.