Comment by keeda
12 hours ago
> I swear people should start blacklisting CEOs and refuse to work under them if they're part of the blacklist.
Look at the job market. They know they can get away with it and so they don't care.
My current theory is that this is partly why executives are desperate to get AI to work, and why investors are ploughing billions into AI. They know they've burnt too many bridges, and they need AI to work so they never have to turn to us again. Otherwise the pendulum will swing even farther in the opposite direction, putting even more bargaining power in the hands of employees than they had in the post-COVID job market.
Unfortunately, AI does seem to be working very well, and I don't see great outcomes for us on the current trajectory. I expect turmoil before a new social contract is established.
> Unfortunately, AI does seem to be working very well, and I don't see great outcomes for us on the current trajectory.
The people decreasing headcount are already behind the curve. They're thinking about how many people they need to run things instead of how many people they need to reinvent an industry.
Yes, unfortunately. Each headcount, properly trained and reskilled, is now worth a whole team by themselves! I blame what's happening instead on capitalism and the inability to look past the next quarterly earnings statement.
I think in the (very) long run it will end up being for the best. This will force us lowly serfs to grow beyond our wage-labor mindset and leverage this force multiplier for ourselves.
AI lets those with capital get rid of labor, but by the same token(s ;-)) labor can now achieve outsize results without capital!
It is going to be very uncomfortable, but evolution always is.
It seems AI code is producing technical debt at an alarming speed. What many people think of as "AIs don't need code to be pretty" is misunderstanding the purpose of refactoring, code reuse, and architectural patterns that AIs appear to skip or misunderstand with regularity. A reckoning will come when the tech debt needs to be paid and the AIs are going to be unable to pay it, the same way it happens when humans produce technical debt at a high rate and do not address it in a timely manner.
I agree with this take. The AI is producing tons of debt still, we will see if that pattern holds or if people automate that part into the agents as well.
Actually, AIs do need code to be pretty. It's becoming widely accepted that whatever is good for humans -- modular code, tests, docs, linters, fast feedback loops -- is good for agents.
I keep repeating this, but the latest data from large-scale dev reports like DORA 2025 and DX find that AI is simply an amplifier of your engineering culture: teams with strong software engineering discipline enjoy increased velocity with fewer outages, whereas teams with weak discipline suffer more outages.
About the only thing they need (for now) is architectural guidance and spot-checking of results. But then how many architects does any given company need?
> why investors are ploughing billions into AI. They know they've burnt too many bridges,
This is a very interesting perspective, I haven't thought of it like that.
Your theory is wrong.
Someone will inevitably have to prompt the AIs; CEOs and other executives are NOT going to be doing it themselves. The people driving those AIs will have greater leverage as fewer and fewer people choose a career in tech.
Also, when an AI fucks up in a way only a human can fix, the human must be available.
What I see more likely is a future where software engineers do even less work but frustratingly you still need them around to fix problems whenever they come up. Kind of like firefighters.
Agreed, and the AI wranglers will be the equivalent of architects and staff+ engineers today, and they will be paid handsomely. BUT! They are a pretty small fraction of the current developer workforce. The remaining junior-to-mid level engineers will have to uplevel themselves while having no opportunity for hands-on experience to do so as they get laid off in bulk.
And note, this pattern is going to repeat across the entire white collar workforce, because the same pyramid scheme holds everywhere in knowledge work.
A new equilibrium will be found, but that will be years, maybe a decade+ away? That's the period of turmoil I am concerned about.
> Also, when an AI fucks up in a way only a human can fix, the human must be available
Actually it's more “when an AI fucks up in a way that you need a human to take the blame, the human must be available”