Comment by austin-cheney
5 hours ago
From around 2010-2020 the stock market rewarded growth more than profit. That meant large tech employers hired like crazy to signal growth.
Then came COVID and the economy contracted. As a result, the stock market shifted to rewarding profitability. So the excess developers had to go. We are still feeling this.
I do agree that AI is not to blame for this. In fact I will go further and claim that AI is a net negative that makes this worse for the employer, by ultimately requiring more people who, on average, have lower confidence and lower capability than they would without it, but I say that with a huge caveat.
The deeper problem is not market effects or panaceas like AI. The deeper problem is poorly qualified workers and hard-to-identify talent. It's easy to over-hire, and then fire, when everyone generally sucks and doesn't matter. If the average employed developer were excellent at what they deliver, these people would be easy to identify and tough to fire, like engineers, doctors, and lawyers. If the typical developer were excellent at what they do, AI would be a complete net negative.
AI and these market shifts thus hide a lower level problem nobody wants to solve: qualification.
Qualification is a very difficult problem, but I think everyone resents the characterization of "bad devs". Things like the Metaverse failure - apparently $70bn spent for no results - are primarily management failures. Like the Cybertruck, they succeeded 100% at building the product that the CEO wanted. The problem is that the CEO is basically the only person who wants that product.
There's also the thought nobody wants to examine: what if the consumer market total spend is kind of tapped out?
Yes and no. Excellent developers deliver excellent products, like with the Cybertruck example. I wouldn't buy one either, but it appears to be well crafted.
Stellar developers are one step better. Yes, they too deliver excellent products, but they also produce things of value that nobody asked for. Examples include Teflon, Post-it Notes, antibiotics, Linux, Git, and much more.
The US Army changed leadership methodologies about 20 years ago to account for this. The current leadership philosophy is called Mission Command. In the fewest possible words: a leader provides a stated intent and then steps back to monitor while subordinate leaders exercise their own creative initiative to meet that intent. The philosophy before that was called the Military Decision Making Process (MDMP). MDMP required leadership buy-in at each step of a process, choosing from among a set of discussed courses of action. MDMP is now largely relegated to small personnel teams.
Plenty of people wanted the Cybertruck; it's just that the price is too high. It was originally announced at under $40k and, with incentives, could have been in the $30k range.
The F-150 Lightning had the same problem.
Everyone works with bad devs, and they're addicted to AI.
It wasn't covid - it was the post-covid come-down from all the free stimulus and ZIRP. The Russian war in Ukraine is a much closer marker for when the economy started tanking for devs.
covid was a last gasp for zirp. interest rates had been rising since 2016 and would've remained high if not for the pandemic. covid required a quick return to zirp to save the economy from a crash, then a quick return to high interest to save the economy from inflation.
if not for covid, the zirp era would have ended more gently. covid overhiring was the last hurrah for companies to use the low interest rates. without covid, there wouldn't have been the overhiring and subsequent firing.
the market would be as bad as now (or dare i say, *normal*), but it would be stable bad, not whiplash.
How exactly are good doctors easy to identify and hard to fire? And how does it follow that AI is a net negative when wielded by professionals who are excellent at what they do?
If people can't identify qualified professionals without relying on credentials, they probably aren't qualified to be hiring managers.
There are a couple of things that identify talent in the qualified space:
* Peer reviews in the industry
* Publications in peer reviewed journals
* Owner/partner of a firm of licensed professionals
* Quantity of surgeries, clients, products, and so forth
* Transparency around lawsuits, license violations, ethics violations, and so forth
* Multiple licenses. Not one, but multiple stacked on top of a base qualification license. For example an environmental lawyer will clearly have a law license, but will also have various environmental or chemistry certifications as well. Another example is that a cardiologist is not the same as a nurse practitioner or general physician.
Compare all of that against what the typical developer has:
* I have been employed for a long time
More elite developers might have these:
* Author of a published book
* Open source software author of application downloaded more than a million times
Those elite items aren't taken very seriously in employment consideration, despite the effort and weight they carry compared to what their peers offer.
They are not taken into account simply because they may be meaningless for the job advertised.
For example, no company I ever worked for (maybe 7 over 20 years) needed publishing academics. Is the fact that you published a book about some DB architecture 10 years ago going to make you a better team member who delivers things faster? Maybe, maybe not; how could the hiring folks know? You measure and judge what you can and ignore the rest, since it could cut either way and you have no clue, not even about the probability spread.
Doctors with many licenses may be better, or maybe they just like studying more than actually working with patients, and thus could be even worse than average; how do you want to measure that? A stellar doctor may simply have no time for anything beyond their current work because they also dedicate time to family and raising children. The quality of a doctor, especially a surgeon, is mostly a function of the work already performed (knowing tens of doctors closely across all branches, they often claim this themselves), not some extra papers. And so on and on.
> And how does it follow that AI is a net negative when wielded by professionals who are excellent at what they do?
Simple: the 80% of code monkeys who are not good at what they do will cause way more damage than the "professionals who are excellent at what they do". And outside of tech, I can guarantee you the vast majority of people use LLMs to do less, not to do more or to do better.
It's also easily verifiable: supposedly AI makes everyone a 10x developer/worker, and it's been ~3 years now, so where are the benefits? Which company or industry made 10x progress, or 10x revenue, or cut 90% of their workforce?
How many man-hours are lost on AI slop PRs? AI-written tickets which seem to make sense at first but fall apart once you dig deep? AI reports from McKinsey & Co that cite fake sources?
The big wins I've seen with LLM tools are in translation more than in the actual code.
We have been able to move our "low cost" work out of India to Eastern Europe, Vietnam, and the Philippines; we pay more per worker, but we need half as many (and can actually train them).
Although our business process was already tolerant of low-cost regions producing a large amount of crap: separate teams doing testing and documentation...
It's been more of a mixed bag in the "high skill" regions: we have been getting more pushback against training, with people wanting to be on senior+ teams only, due to the LLM code produced by juniors. This is completely new, as it's coming from people who used to see mentoring and teaching as a solid positive in their job.
It's really only in the last year that LLMs have gotten great and can output massive blocks of functional code.
LLMs are at least a 10x speed-up on the "producing code" portion of the job, but that's often only a fraction of the overall job. There's lots of time spent planning, sitting in meetings, doing research, navigating corporate red tape, etc.
For writing code and unit tests, and generating deployment YAML and Terraform, it's easily a 30x speed-up for me. I can do in 1 or 2 hours what would previously have taken a week.
I think we'll see something counterintuitive happen where hiring picks up dramatically. Companies are willing to overspend and over-hire to automate everything away once and for all.