The tech market is fundamentally fucked up and AI is just a scapegoat

(bayramovanar.substack.com)

>In traditional industries like manufacturing you don’t hire 500 factory workers unless you have a production line that needs them. You don’t over-hire based on a guess.

Traditional factories do make wrong guesses about the future and overhire all the time. Example: https://www.google.com/search?q=Ford+f150+lightning+laid+off...

  • The whole article rests on this false economic claim, that traditional industries don’t overhire based on expectations. They absolutely do, ALL THE TIME. Manufacturing, automotive, airlines, energy etc. all of them make demand bets and lay people off when those bets fail.

    Cheap money amplified this cycle, but this isn’t a tech-specific "failure"; it’s just how forecasting under uncertainty works.

    It’s incredible how some engineers assume they understand economics, then proceed to fail on some of its most basic premises. This tends to happen when engineering-style certainty is applied to systems that are driven by incentives and uncertainty.

    • I will avoid getting into a “who understands economics better” debate.

      But factory workers usually require specialized machinery, tooling, and physical capacity, which makes overhiring slower, harder and more constrained. Those investments force more deliberate planning.

      By contrast, engineers mostly require a laptop and company hoodie... That low marginal cost makes it far easier to hire aggressively on expectations and unwind just as aggressively when those expectations change.

    • My take is that the traditional industries you mentioned (manufacturing, automotive, etc.) are forming expectations about markets they already know. Tech, by contrast, forms expectations about markets that are brand new and/or have no proven track record whatsoever.

      And with that, VR would like to take the microphone and have a few words…

    • Tech companies also benefit from overhiring. It gives them slack to absorb crises, and fuel to move quickly on new growth. Eliminating the fear of layoffs allows employees to take more risks and explore.

      The current crop of tech companies cutting staff is going to lead to a large number of dead giants. The staff who survive the layoffs will be risk-averse and focused on being defensible in a job-cut situation. You see this in legacy firms where it takes 10 people to make a change because each person holds a small slice of the permissions required to effect the change. This pattern is by design: laying off any of the ten people on different teams would kill dozens of critical business processes.

      This is not how you make high growth firms.

    • > It’s incredible how some engineers assume they understand economics

      I would say most engineers. The reason is simple - basic economics is not taught in the public schools, and economics/business is not a required course for an engineering degree.

      One of the best classes I ever took was a summer class in accounting.

    • There's also this tech exceptionalism/grass-is-greener bias where software engineers especially tend to romanticize other industries and professions.

    • > It’s incredible how some engineers assume they understand economics, then proceed to fail on some of its most basic premises. This tends to happen when engineering-style certainty is applied to systems that are driven by incentives and uncertainty.

      Dunning Kruger effect, am I right?

      We assume we're smart because we can make computers go beep boop, so we must know about the economy too. I'm part of this as well, even though I (or we all) know about the effect, but my point is that there should be some humility when we are wrong.

      I can be wrong, I usually am. If someone corrects me, I would try to hopefully learn from that. But let's see how the author of this post responds to the GP's (valid, from what I can tell) critique.

      Edit: Looks like they have already responded (before I wrote this, but I missed their comment, where they said it's not at the scale or frequency we see in tech).

    • > Cheap money amplified this cycle,

      This is more-or-less the rub.

      Fortunately we had a kinda-sorta sane monetary policy under the Biden administration once the pandemic started to ebb, but now we've got Mr. Appearances trying to push Jerome Powell to make stupid mistakes.

  • Sure, factories also overhire sometimes or shrink their businesses but not at the scale or frequency we see in tech.

    • Only because their margins usually mean that those who do overhire go out of business much sooner.

    • that's true but does tech generally hold a more aggressive default-growth stance?

    • I've never worked in tech, but I've worked at manufacturers of various different sizes and places in the manufacturing supply chain, including finished goods.

      Manufacturers can't hire beyond the places in production where someone can stand and do something. There needs to be some kind of equipment or process for a worker to contribute in some meaningful way, even if it is merely for a projection of increased production (e.g., hiring a second shift for a facility currently running one shift).

      What I wonder is if in tech, the "equipment" is a computer that supports everything a developer needs. From there, new things can be added to the existing product.

      Manufacturing equipment is generally substantially more expensive than a computer and supporting software, though not always. Might this contribute to the differences, especially for manufacturing that normally runs 24-hour shifts?

    • The manufacturing sector has had the most job cuts over time because the prediction that China wouldn't surpass it was wrong. Tech employment, on the other hand, almost never decreased.

Anecdotally, I didn't see more safety for engineers working on cash cows. Quite the contrary, if a product is already bringing revenue, it is an easy decision for the business to squeeze a bit more margin by letting people go. Stable, cash-generating projects are often first to be put into efficiency/maintenance mode.

  • > Stable, cash-generating projects are often first to be put into efficiency/maintenance mode

    Or they're actively enshittified, aiming to extract more short-term revenue at the cost of a long-term future...

  • It may not be such a bad idea, because many engineers have an itch for fixing things that aren't broken.

    • It is rather that many software developers see how bad the code is, and thus attempt to reduce the code debt if possible. I have rarely seen software developers fixing things that aren't broken (though it is often not easy for managers and people who are not deeply knowledgeable about the project to see why what is there is broken).

      On the other hand, I have seen politically very adept software developers who would rather get managers to advertise some technology that they would love to introduce into the project.

  • Also anecdotally, I've had a handful of software development positions throughout the years (never at a company with more than 200-300 people) and have yet to be laid off due to money. I've yet to be laid off at all, but that's irrelevant.

    I truly believe that these new tools will actually hurt the bigger companies and conversely help smaller ones. I'm in healthcare. The big players in the EMR space are Epic and Cerner. They are one-size-fits-all behemoths that hospitals have to work against rather than with. What if, instead of having to reach out to the big players, the economics of having a software developer or 2 on staff make it such that you could build custom-tailored, bespoke software to work "with" your company and not against?

    • Aren't they still going to need to reach out to the big players because of the regulatory environment? And for good reason, as it happens. We don't need hospitals handing over the public's health data to the cheapest person they can find to prompt it all into Claude.

    • > What if, instead of having to reach out to the big players, the economics of having a software developer or 2 on staff make it such that you could build custom-tailored, bespoke software to work "with" your company and not against?

      It's probably risk and liability, not development costs, that keep things from moving in-house, and those aren't things AI is great at mitigating.

    • > What if, instead of having to reach out to the big players, the economics of having a software developer or 2 on staff make it such that you could build custom-tailored, bespoke software to work "with" your company and not against?

      The behemoths exist especially, but not exclusively, in that space because regulations (correctly) are steep. In the case of hospital systems you're talking about both the management and protection of employee and patient data. That's not to say, of course, that the behemoths are particularly good at that; it's merely that if the hospital rolls its own solution, as you suggest, it then takes on the liability should that system go wrong. On the other side, if Epic has a data breach, every hospital shrugs its shoulders. It isn't their problem. And, even more fundamentally, if Epic as a product sucks ass... well. The employees didn't choose it, neither did the patients; leadership did.

      You see these relationships (or lack thereof) all over the place in our modern world, where the people doing the work with these absurdly terrible tools are not given any decision-making power over which tools to use. Hell, at my workplace we actually have some say, in that leadership asks if we're happy with our various HR software and such, but fundamentally it all pretty much sucks, and we're currently sitting on the least shitty option we could find, which is far from a solid fit for our smaller company. But it's the best we can do, because none of these suites are designed to be good for people to use; they're designed to check a set of legal and feature checkboxes for the companies they sell to.

      Honestly I don't know how you fix this, short of barring B2B SaaS as an entire industry. Time was, when you wanted to run a sales company, you had to run your own solution for keeping track of internal data. Salesforce didn't exist. You had rows upon rows of file cabinets; if there was a fire, data was lost, and if a disgruntled worker stole your sales list and sold it to a competitor, that was its own issue to deal with. Now crooks can crack the locks off of NetSuite and steal your whole fucking business without even knowing or caring where the hell your HQ is, and our business universe, if you will, is bifurcated all to hell as a result. Companies are engaged in constant games of "pin the legal responsibility on someone else" because to compete, they need internet- and software-based sales and data management systems, but building those systems is a pain in the ass, and then you're responsible if they go wrong.

Interesting framing. The article makes a compelling case that we're seeing the hangover from 14 years of ZIRP-fueled hiring rather than an AI apocalypse.

But I'm curious what people think the equilibrium looks like. If the "two-tier system" (core revenue teams + disposable experimental teams) becomes the norm, what does that mean for the future of SWE as a career?

A few scenarios I keep turning over:

  1. Bifurcation - A small elite of "10x engineers" command premium comp while the majority compete for increasingly commoditized roles
  2. Craftsmanship revival - Companies learn that the "disposable workforce" model ships garbage, and there's renewed appreciation for experienced engineers who stick around
  3. Consulting/contractor becomes default - Full-time employment becomes rare; most devs work project-to-project like other creative industries

The article argues AI isn't the cause, but it seems like it could accelerate whatever trend is already in motion. If companies are already treating engineers as interchangeable inventory, AI tooling gives them cover to reduce headcount further.

For those of you 10+ years into your careers: are you optimistic about staying in IC roles long-term, or does management/entrepreneurship feel like the only sustainable path?

  • As someone with many animator friends, this sounds very bleak. Their work processes are very similar to software engineering, with the difference that their hiring process is much quicker (they just show a bunch of reels and shots they made for previous films or cartoons, and they’re hired). The sad part is that they have almost no labor rights, and competition is incredibly high, which means pay is very low and turnover is high. All these years, I’ve been setting my expectations that one day my field may become like that.

  • I am 10+ years into my career. I don’t think mgmt/entrepreneurship is the only sustainable path. But I believe I may become a manager of 5-10 Claudes.

    • What are your Claudes going to be building and who is going to fund it?

  • "Consulting/contractor becomes default - Full-time employment becomes rare; most devs work project-to-project like other creative industries"

    I'm in the tech industry and have been doing this for 12+ years now. In the beginning, it was because I wanted to live overseas for a few years, without a break in my career.

    Now, it's about survival. I buy my own health insurance (me and my family) in the marketplace every year (so I'm not tied to an employer), work with multiple clients (I never really have to worry about getting laid off), and make much more than a FTE.

    While all my friends in tech are getting laid off or constantly in fear of getting laid off, I don't have to worry.

    I also find that because I touch so many different technologies, I have to turn down work. I turned down a company last year that wanted me in-house, and one this year that would have been too demanding on my schedule.

    It's also flexible and always remote.

From around 2010-2020 the stock market was rewarding growth more than profit. That means large tech employers hired like crazy to indicate growth.

Then came COVID and the economy contracted. As a result the stock market changed to reward profitability. So, excess developers had to go. We are still feeling this.

I do agree that AI is not to blame for this. In fact I will go further and claim that AI is a net negative that makes this worse for the employer, by ultimately requiring more people who on average have lower confidence and lower capabilities than they would without it, but I say that with a huge caveat.

The deeper problem is not market effects or panaceas like AI. The deeper problem is poorly qualified workers and hard-to-identify talent. It’s easy to overhire, and then fire, when everyone generally sucks and doesn’t matter. If the average employed developer were excellent at what they deliver, these people would be easy to identify and tough to fire, like engineers, doctors, and lawyers. If the typical developer were excellent at what they do, AI would be a complete net negative.

AI and these market shifts thus hide a lower-level problem nobody wants to solve: qualification.

  • Qualification is a very difficult problem, but I think everyone resents the characterization of "bad devs". Things like the Metaverse failure - apparently $70bn spent for no results - are primarily management failures. Like the Cybertruck, they succeeded 100% at building a product that the CEO wanted. The problem is that the CEO is basically the only person that wants that product.

    There's also the thought nobody wants to examine: what if the consumer market total spend is kind of tapped out?

    • Plenty of people wanted the Cybertruck; it's just that the price is too high. It was originally announced to be under $40k, and with incentives could have been in the $30ks.

      The F-150 Lightning had the same problem.

  • It wasn't covid - it was the post-covid comedown from all the free stimulus and ZIRP. The Russian war in Ukraine is a much closer marker for when the economy started tanking for devs.

    • covid was a last gasp for zirp. interest rates had been rising since 2016 and would've remained high if not for the pandemic. covid required a quick return to zirp to save the economy from a crash, then a quick return to high interest to save the economy from inflation.

      if not for covid, the zirp era would have ended more gently. covid overhiring was the last hurrah for companies to use the low interest. without covid, there wouldn't have been the overhiring and subsequent firing.

      the market would be as bad as now (or dare i say, *normal*), but it would be a stable bad, not whiplash.

  • How exactly are good doctors easy to identify and hard to fire? And how does it follow that AI is a net negative when wielded by professionals who are excellent at what they do?

    If people can't identify qualified professionals without relying on credentials, they probably aren't qualified to be hiring managers.

    • There are a couple of things that identify talent in the qualified space:

      * Peer reviews in the industry

      * Publications in peer reviewed journals

      * Owner/partner of a firm of licensed professionals

      * Quantity of surgeries, clients, products, and so forth

      * Transparency around lawsuits, license violations, ethics violations, and so forth

      * Multiple licenses. Not one, but multiple stacked on top of a base qualification license. For example an environmental lawyer will clearly have a law license, but will also have various environmental or chemistry certifications as well. Another example is that a cardiologist is not the same as a nurse practitioner or general physician.

      Compare all of that against what the typical developer has:

      * I have been employed for a long time

      More elite developers might have these:

      * Author of a published book

      * Open source software author of application downloaded more than a million times

      Those elite items aren't taken very seriously in employment considerations, despite carrying far more effort and weight than what their peers offer.

    • > And how does it follow that AI is a net negative when wielded by professionals who are excellent at what they do?

      Simple: the 80% of code monkeys who are not good at what they do will cause way more damage than the "professionals who are excellent at what they do". And outside of tech, I can guarantee you the vast majority of people use LLMs to do less, not to do more or do better.

      It's also easily verifiable. Supposedly AI makes everyone a 10x developer/worker, and it's been ~3 years now, so where are the benefits? Which company/industry made 10x progress, 10x revenue, or cut 90% of its workforce?

      How many man-hours are lost to AI slop PRs? To AI-written tickets which seem to make sense at first but fall apart once you dig deep? To AI reports from McKinsey & Co that use fake sources?

  • I think we'll see something counterintuitive happen where hiring picks up dramatically. Companies are willing to overspend and over-hire to automate everything away once and for all.

I'm not arguing against the core arguments of the article (I agree with most of the points), but on the other hand, software development is (at least currently) an essentially iterative process, one that differs greatly from other production processes (e.g. buildings, cars). We all know how difficult it is to estimate how much development time something takes. Planning is hard, and outcomes therefore have greater variability.

  • I generally agree with you, but if you look deeper, cars, buildings, and the underlying know-how didn’t appear in a day either.

    Those were also iterative processes: first tires and mud houses, then horse carriages and brick houses, and eventually cars and buildings.

    In that sense, it’s not fundamentally different from engineering today. Working on core engineering functionality of a company is essentially the same kind of process.

    The difference lies in whether you’re working on core functionality, or on some iterative experiment that nobody knows will succeed.

    • When it comes to the cost of rebuilding, we can't compare software engineering with other industries (like cars or buildings). I think this makes it much more iterative than them. Living in North America, I feel like 99% of apartments and houses can be grouped into 5-10 floor plans. I think that's because when you are designing a new building or house you really can't take much risk. You do what has already worked. Software also has trends, but they change far more often. You also can't do A/B testing or targeting, or measure every single interaction a potential customer has with your product. The nature of building "software" really brings so many options to the table, which increases the number of iterations by an order of magnitude.

    • I'm talking about how a single product is produced, not about its evolution through centuries. Anyway, the point wasn't to compare specific details, it's just an analogy.

  • You aren't building a car when you write software; you're building a car factory.

Mind you, the IT overinvestment sucked money out of other industries. My friends who chose different career paths don't seem particularly content either.

  • It's insane how many billions went into idiotic "tech" companies like WeWork when they could've been invested somewhere with an actual outcome or benefit for society.

    Let's not even bring the gig economy into discussion..

>In traditional industries like manufacturing you don’t hire 500 factory workers unless you have a production line that needs them. You don’t over-hire based on a guess.

This is interesting; in my experience it's seemed to be the opposite.

In manufacturing, it's much easier to come up with a specific number of employees you need for a given project or contract.

If the contract is expected to sign, you may hire early to get ahead of it.

If the contract falls through, or an existing contract is cancelled, you know exactly how many people you need to cut to balance labor with your current commitments.

IMO, this is a not-wrong but less insightful perspective on what Ed Zitron talks about in The Rot Economy. Cycles of overhiring and layoffs happen, but they're not the core mechanism. AI isn't the cause, but it's also not a scapegoat. It's just the current placeholder, like blockchain and the metaverse, that allows companies to have their valuations based on growth rather than profit.

https://www.wheresyoured.at/the-rot-economy/

  • Right, AI isn't the reason, but it sure as hell is an accelerant and part of the pattern that ultimately is extremely unsustainable. It's going to be really ugly when it ends but it's inevitable.

Could this not be simply summarized as ZIRP fueled a bubble in tech long before AI? Mass layoffs have been cycling on and off since ZIRP stopped. I doubt we will see that change anytime soon.

Having stable core revenue teams and disposable experimental teams seems like an excellent way to ensure the experienced developers will leave as soon as they can, and that any new knowledge leaves the company as soon as possible.

So in the end you lose both your ability to acquire new knowledge and to keep existing knowledge.

There has to be a better way surely?

The main issue IMHO is the monopolization of the industry, especially in the US. Once the giants do layoffs, the rest of the market can't absorb the people effectively, which leads to an oversaturated job market.

We can of course discuss how many people got into industry during COVID heyday and whether they should have, but mostly I think it's about those behemoths having disproportionately high impact on the entire labour market.

This tracks; it reminds me of Cory Doctorow's talk on the reverse-centaur situation, where he gave a nice rundown of the tech market of the past 15+ years.

Do anything and everything to remain in the "growth stock" category. Spend money on useless features, and on engineers working on those useless features, as long as it makes your company look like it has a bright future and room to grow.

  • Wasn’t familiar with this, tracked it down

    https://locusmag.com/feature/commentary-cory-doctorow-revers...

    > There’s a bit of automation theory jargon that I ab­solutely adore: “centaurs” and “reverse-centaurs.” A centaur is a human being who is assisted by a machine that does some onerous task (like transcribing 40 hours of podcasts). A reverse-centaur is a machine that is assisted by a human being, who is expected to work at the machine’s pace.

This is just scarcity based economics failing us in the ways that it has been for decades now. The only thing that stands out about tech is that from 2010 to 2020 it seemed to be withstanding the rot.

Proven businesses can always borrow at fairly low rates. When capital gets really cheap it starts to incentivize greater amounts of risk taking because investors actually want risk.

If you are starting a dry cleaning business, you have a cost of the equipment, rent and other well known factors. Starting a tech company in a new and unproven area has different expenses and a different risk/reward profile.

Malinvestment can come in a lot of flavors. Cheap capital will result in too many dry cleaners and also too many startups that probably shouldn't have gotten funding.

The downside comes in various forms. 1) existing dry cleaning businesses are less profitable because of increased competition, and 2) startups hire scarce engineers and drive up wages, which drive up costs for everyone.

Cheap capital is justified because the goal is growth, but it is a blunt instrument that creates hot spots and neglected areas simultaneously. Compare the approach used in the US with the approach taken by China. Chinese firms face significantly more competition than firms in the capitalist US, but overall China's policies are crafted with a more deliberate eye toward their distributional consequences, and notions of the greater good are much more subject to sharp critique and pressure across social and industrial strata.

What we are seeing in the US is that policymakers have come to believe that the growth-focused approach is an escape hatch that can be used to reduce the effects of other bad decisions, but at some point the size of the inflated economy gets big enough that it takes on a political life of its own -- post-9/11 defense contractors have dramatically more lobbying and policy-influencing power than they had prior. Today, systemically risky financial industry participants have significantly more political clout than they had before the 2008 correction.

In other words, the fabric of (political) reality shifts and it becomes hard to identify what normal would look like or feel like. In my view, AI adds fuel to the existing fire -- it rapidly shifts demand away from software engineers and onto strategists -- give the team a strategy and now with AI the team will have it done in a few weeks. If not, a competitor will do it without poaching anyone from your team.

And market forces include both creative and destructive forces. Firm failure is a feature, not a bug.

  • Cheap venture capital is uniquely driven by the interest rate more than any other factor. Low interest rates drive money away from safer vehicles toward riskier ones because they still offer a return. This is good for people starting companies, but in the long run the decision makers on those investments almost always turn out to have mispriced the risk factor and end up with negative returns.

    This then causes the market to dry up again and if the interest rate hasn't dropped even further then a lot of companies that need follow up investment will now get killed off. It's a very Darwinian landscape that results from this and I've been wondering for years if there isn't a better way to do this.
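    To make the rate-to-risk mechanism concrete, here is a toy calculation (all rates are made-up assumptions for illustration, not market data): an investor who must hit a fixed return target has to put a much larger share of the portfolio into risky assets when the safe rate falls.

    ```python
    # Illustrative sketch: how a lower risk-free rate pushes capital
    # toward risky assets. All rates here are assumed for the example.
    def risky_allocation(target, safe_rate, risky_rate):
        """Weight w on the risky asset solving
        w * risky_rate + (1 - w) * safe_rate == target."""
        return (target - safe_rate) / (risky_rate - safe_rate)

    target = 0.07  # return a fund wants to deliver (assumed)
    risky = 0.12   # expected venture-style return (assumed)

    for safe in (0.05, 0.01):  # "normal" rates vs. ZIRP-era rates
        w = risky_allocation(target, safe, risky)
        print(f"safe rate {safe:.0%}: put {w:.0%} into risky assets")
    ```

    With these assumed numbers, the required risky allocation roughly doubles (about 29% to 55%) when the safe rate drops from 5% to 1%, which is the flood of capital into venture described above.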

    • Excellent points. You've perfectly described the brutal, interest-rate-driven VC cycle -- capital floods in, risk gets mispriced, and the market eventually corrects with Darwinian force. That's the "blunt instrument" in action.

      Meanwhile in China, the approach is fundamentally different. Capital isn't just cheap; it's strategically directed by the state with goals beyond financial return. The aim is "new quality productive forces" -- slow-burn, systemic growth that reinforces social stability and industrial upgrade, not a boom-bust race for unicorns.

      The current AI boom is our real-time experiment to see if this is the "better way." The U.S. model, as you note, is driven by massive private investment (over $109B in 2024) and is prone to hype cycles. China's model is state-planned, focusing on the "AI Plus" integration of technology across its industrial base, despite investing less ($9.3B) and facing constraints like advanced semiconductor access.

      We're watching two competing logics: one seeking market-defining breakthroughs through volatile, capital-intensive competition, and another pursuing broad-based, stability-oriented technological integration. The results of this test will show which system better transforms capital into lasting, system-wide advantage.

Of course AI has an impact; we can't close our eyes to that. At least, it created the belief that massive layoffs are now affordable without affecting productivity.

I don't really understand what the author is arguing for here. Yes, tech companies (and everyone else) hired a ton of people when money was cheap, and now they are doing lay offs. I suppose the alternative would be employing people for life? Or that it should be costlier to fire people (ie like in Europe)?

Also, if you are using AI to even write such a simple blog post, then perhaps corporations are indeed using it for all kinds of purposes too, and that undoubtedly reflects on their hiring.

> Most engineers (including me) spent months grinding LeetCode at least twice in their career, studying system design, and passing grueling 6-round interviews to prove they are the “top 1%.”

Really grateful that the opportunities I've been given weren't predicated on knowing things completely irrelevant to my job. I have spent exactly zero time solving LeetCode problems in my career (beyond algorithm stuff in college).

Gravity is coming back to Silicon Valley: workers are realizing that the Bell Labs image they were sold was mostly innovation theater and talent-hoarding for websites overstuffed with ads designed to manipulate users into buying junk.

Unfortunately, I think the next head of the fed is going to be appointed specifically to reduce interest rates, so we’re probably just going to go back to the 0-rate trough.

  • Quite likely. This will be great for growth.

    .. and terrible for inflation, but that can be blamed on other people.

    • I don't see how this can happen, tbh. Like, the chair is just one vote, and the regional Feds have almost a majority. Presumably, whoever gets the job will say they'll reduce interest rates, but I don't see how they can actually accomplish this without getting the rest of the Fed on board.

    • Yes, it will, as always, be blamed on wages, even when wage growth is slower than inflation.

Don’t think it needs to be either or.

ZIRP, AI, over hiring, and a wave of boot camp labour supply I suspect all contribute.

Plus we’re likely approaching saturation on a lot of fronts, attention and ad density among them. Things like YouTube seem to be at the edge of how many ads they can force-feed before people just stop using the site because it’s unusable. Enshittification isn’t a cycle, it’s a one-way trip. Combine that with over-hiring and it’s bound to hit a wall.

  • But on the other side there are countless opportunities for valuable revenue-generating products that aren't based on dopamine addiction and/or ad tech.

    It's a values problem, not an opportunity problem. Tech industry management is locked in a death spiral because it lacks the ethics or the vision to create real value, as opposed to extractive predatory value.

    This applies equally to consumer relationships and employee relationships. The default value is "How can I screw these people over to get more of what I want?"

    And that is - ironically - a failure of brain chemistry and emotional regulation.

    Because making number go up is a catastrophic addiction in its own right.

If you make a prediction that the stock market is about to collapse every year, then one year you will eventually get it right.
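A toy calculation makes the point (the 10% annual crash probability here is a made-up number for illustration, not a real estimate): even a low per-year chance compounds into near-certainty over enough years.

```python
# Toy illustration: probability that a yearly "crash is coming" call
# is right at least once, assuming (hypothetically) a 10% chance of a
# crash in any given year, independent across years.
p_crash = 0.10
years = 20
p_at_least_one_hit = 1 - (1 - p_crash) ** years
print(f"{p_at_least_one_hit:.0%}")  # roughly 88%
```

So the perma-bear is "right" eventually almost by construction, which says nothing about the quality of the prediction.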

  • The market can remain irrational longer than you can remain solvent. But why do you think gold has tripled in price?

> The result is the worst of both worlds. European engineers now face US-level job insecurity with European-level compensation and limited mobility.

Firing people in most of Europe is still not as easy as it is in the US.

The opposite is also true: it's not that easy to leave your employer either, and you have to give 1/3/6 months' notice before leaving, depending on your role/seniority/contract.

Sometimes companies even make you sign a 12-month notice clause, where they pay you a fixed monthly bonus but you can't leave without giving 12 months' notice; my SO has signed one.

  • Indeed. It's not impossible to lay off people in Europe, but they can't just say "you're fired!". There's a process; it costs time and perhaps 3 to 6 months of salary, and you have to prove the layoff is needed. The business is incentivized to reduce hiring and find other work for the employee to do instead.

  • > This is just false, firing people in most of Europe is still not easy.

    Downsizing is a perfectly legal reason to fire people in Europe, and it happens all the time when big companies do mass firings. The difficult part is getting to choose which individuals to fire.

    • 1. Most of Europe makes it hard to do layoffs, and most countries have systems to prevent mass layoffs (including a state contribution to the salary; the Italian Cassa Integrazione is an example). Downsizing requires demonstrating real reasons for it: financial trouble, lack of work, and an inability to reassign or retrain people into different roles. Not easy.

      2. American companies find every time that doing layoffs in Europe is very hard for them. Most recently Amazon which, unable to lay off its people in Milan, resorted to secondary tactics like demanding a return to office (from people who had signed hybrid or fully remote contracts), mobbing, and generous severance packages (up to a year in salary). Still, if the worker said no, it was no.

> The liquidity that flooded the tech sector didn’t just inflate valuations; it inflated teams, egos, and expectations.

Yes it's kind of obvious to anyone who's looking at the actual work being done: the constant churn of OS updates, the JS-framework-du-jour, apps being updated constantly...

It seems to me like a lot of this is just busywork, as if engineers need to justify having a job by releasing inconsequential updates all the time. Bullshit jobs, anyone?

I for one would really like things to slow down, we all deserve it!

  • Because it's important to recognize when someone you disagree with is right about something, I would like to note that Musk sacking most of the Twitter staff has not made the site unable to stay up. (The site has got worse for other reasons.)

There is some fat in the system that likely needs to be cut off.

Here is a Googler complaining about not being served lobster bisque:

https://x.com/Andercot/status/1768346257486184566?s=20

Zero interest rate phenomenon.

  • For many, this is likely the first widespread economic crash they have experienced since 2008.

    We could get another 2008-like market crash before 2030, once the major AI companies begin to IPO onto the public markets.

I will add some commentary from my subjective POV in IT:

“Efficiency” is a damned lie. Enterprise IT is one of the most inefficient spaces out there, full of decades of band-aids layered atop one another in the form of fad products, fancy buzzwords, and “defining leadership” projects. The reason you cannot get shit done at work quickly isn’t because of bureaucracy or management layers standing in the way so much as it’s the vested interest in weakening IT and Ops teams so that those higher-ups can retain more of the profit pie for themselves.

My entire job is to make technology become so efficient that it fades into the background as a force amplifier for your actual work, and I’ve only spent ~1/3rd of my 15+ year career actually doing that in some form. I should be the one making sure documentation is maintained so that new hires onboard in days, not months. I should be the owner of the network and/or compute infrastructure the business needs to operate, not some MSP in another country whose contract you’ll replace with a lower bidder next year. I should be the one driving improvements to the enterprise technology stack in areas we could benefit from, not some overpriced consultant justifying whatever the CIO has a hard-on for from his recent country club outing.

Consultants, outsourcing, and fad-chasing aren’t efficient. They do not better the business, overwhelmingly. AI won’t magically fix broken pipelines, bad datasets, or undocumented processes, because it is only ever aware of what it is told to be aware of, and none of those groups have any interest or incentive in actually fixing broken things.

The tech industry is woefully and powerfully inefficient. It hoards engineers and then blocks them from solving actual problems in favor of prestige projects. It squanders entire datacenters on prompt ingestion and token prediction instead of paying a handful of basically competent engineers a livable salary to buy a home near the office and fucking fix shit. Its leaders demand awards and recognition for existing, not for actually contributing positively back to society - which leads to stupid and short-sighted decision-making processes and outcomes.

And all of this, as OP points out, is built on a history of government bailouts for failures and cheap debt for rampant speculation. There’s no incentive to actually be efficient or run efficient businesses, and this is the resultant mess.

  • I think you need some kind of source to back up the idea that IT or the software industry as a whole isn’t efficient.

    Software companies have much higher profit margins than companies that ship physical products. There really aren’t many industries that do better margins than software.

    To sell software, you don’t need a production facility, warehouse, nor do you even need an office building if you don’t want one.

    https://pages.stern.nyu.edu/~adamodar/New_Home_Page/datafile...

    • …bruh. Literally the very first line:

      > I will add some commentary from my subjective POV in IT:

      Subjective is doing the carrying, there. I am admitting up front that this is specific to me, my career, and the specific life experiences I’ve had with it thus far.

      Like…I won’t even entertain the rest of your comment if you’re not even going to read the entirety of mine before vomiting out an “UhM aHkShUaLlY” retort.

      1 reply →

Personally I think all these posts miss 50% of the issue... I agree it's not AI, but I suspect it's only partially an interest rate story.

Tech changed a lot from 2010 to 2020. Prior to 2010 almost everything built required a huge amount of development effort, and in 2010 there was still a huge amount of useful stuff to be built.

Remember – prior to 2010 a lot of major companies didn't even have basic e-commerce stores because the internet was still a desktop thing, and because of this it really only appealed to a subsection of the population who were computer literate.

Post 2010 and post iPhone the internet broadened massively. Suddenly everyone was online and companies now had to have an e-commerce store just to survive. Only problem was that there wasn't a Shopify or even npm to build from... So these companies had to hire armies of engineers.

Similarly there was no Uber, online banking was barely a thing, there was no real online streaming services, etc, etc, etc...

During this time almost everything had to be built by hand, and almost everything being built was a good investment because it was so obviously useful.

Around 2015 I realised that e-commerce was close to being a solved problem, both in how most major companies had built out fairly good e-commerce stores, and in how it was becoming relatively easy for someone with almost no tech skills to create a store with solutions like Shopify.

I'd argue somewhere between 2010 and 2020 the tech industry fundamentally changed. It became less about building obviously useful stuff like search engines, social media sites, booking systems, and e-commerce stores. Instead the industry started to transition to building what can only be described as "hype products", for which CEOs would promise profits and societal disruption on par with what came before, except this time the market demand was much less clear.

Around this time I noticed both I and people I knew in tech stopped building useful stuff and were building increasingly abstract stuff that was difficult to communicate to non-technical folks. If you asked someone what they did in tech around this time, they might tell you that their company is disrupting some industry with the blockchain, or that they're using machine learning to pick birthday cards using data sourced from Twitter.

I used to bring this up to people in tech, but so many of them at the time had convinced themselves that the money was rolling in because they were just so intelligent and solving really hard problems.

In reality the money was rolling in because of two back-to-back revolutions – the internet and the smartphone. These demanded that almost all industries make a significant investment in technology, and for a decade or so those investments were extremely profitable. Anyone working in tech profited from those no-brainer technical investments.

Post-2015 the huge amount of capital in tech and the cheap money allowed people to spend recklessly on the "next big thing" for many years. 2015 to 2020 was such an amazing time to be in tech because people were basically throwing money at you to build literally anything.

But time's up now. Companies are realising that a lot of the money they invested in tech in recent years isn't profitable and isn't even that useful. So now they're focusing on delivering value and building up profit margins.

The tech market isn't broken; it's coming back down to reality. Like railway workers after the boom, we must face the fact that most of the core infrastructure has now been built. A few of us will stick around making the odd improvement and maintaining what's already there, but that boom isn't coming back. Many of us will need to seek new professions.

  • I mostly agree with your assessment of the industry. However, I think there are still more new and useful products to be built. They are not “the next big thing”, though. Big Tech management has been screwing this up in a few ways:

    1. Prioritizing bets on things that could be as profitable as social media or e-commerce instead of betting on more incremental product improvements.

    2. Pricing everything as recurring revenue, thus increasing the lifetime cost for end users, instead of selling products at a discrete cost and providing end users value.

    3. Optimizing for growth and controlling the vision of products instead of letting small groups of talented people slowly build products.

    4. Treating people as fungible resources and moving them around all the time rather than letting them develop unique expertise and skillsets.

    As a result, any product that can’t achieve $10+ billion annual revenue within a couple of years with a ship of Theseus team is deemed a failure and scrapped.

  • Great comment. I’m thinking along similar lines — what is there in tech to build? And the answer is, not much for the current cohort of services.

    Big product companies with big products are solved problems in their respective fields. Building an Amazon or a Facebook is quite a lot of work. Maintaining it is much less work.

    For a while the industry has done a thing where you do e.g. infrastructure in five different ways across ten different teams across three departments. It created a lot of “work” but it didn’t create much additional value.

    Another instance of this was myriad “internal products” with the idea that some of them will be blockbusters because Paul Buchheit built Gmail as a 20% project in days of yore. That didn’t go so well, either.

    You get the feeling that all this merry but ultimately futile kerfuffle was done to fuel the hype of growth, but the actual job positions were completely uncoupled from revenue growth. For a time this was hard to see while global expansion was happening. Revenue was growing rapidly and so was headcount. It seemed to check out, arithmetically, but it’s not sensible. It doesn’t take twice as many workers to serve twice as many users in this industry.

    When the global expansion didn’t have anywhere else to expand to and revenue stopped growing, the workforce-sustaining illusion fell apart. Now those companies are unloading everyone but the skeleton crew it takes to maintain the products. That’s a lot of people.

  • Brother, the tech market is coming up to reality.

    You and your good reasoning, but it means nothing to me.

    Many of us will indeed seek a new profession: AI engineer. AI engineers mostly write code, but they are superior to software engineers... in the eyes of investors.

    AI engineers mainly build abstract stuff: SaaS. They kill whom the investors currently love the most: managers. 90% of all managers will be replaced. 100 managers can be replaced with 10 managers and 10 security guards.

    Then they kill 70% of the doctors, lawyers, and salesmen.

    Then they kill 50% of the software engineers.

    Then white collar becomes blue collar, and the eyes shift towards energy, transportation, and construction. The AI engineer writes code like an AI, and notices these giant companies are sick. They can lobby all they want, but the AI engineer and the AI lawyer will win.

    Everybody who laughs now about software engineers being replaced has become serious, for the AI engineer has replaced the software engineer.

    Let's see who is right in 2035. Your intelligence, or my fantasy.

When the unwind came, companies cut the layers that existed because hiring was easy, not because the work was essential. In a tighter market, the advantage shifts back to people with real systems experience and operational depth.

> But in Tech, the playbook is different. Companies over-hire software engineers intentionally. To play the lottery.

The actual reason tech companies overhire is because people get promoted based on the number of people that are "under" them. All leaders are incentivized to fight for headcount.

I don't really understand this writer's objection: Big tech is big money and big risks. If some giant FAANG company is going to take a gamble on paying you $350,000 TC you should be squirreling most of that away for yourself in case the bet goes south.

If you want stability, go write Java for an insurance company for $85,000/year in Hartford, CT.

OP is horrified to discover risky gambling happening in Las Vegas.

  • hey, I am the writer of the post. Anecdotally, my last job was at an insurance company, and I got laid off from there too :)

solid analysis but i think you're missing the logical endpoint here: this doesn't end with companies "relearning scarcity"... it ends with the permanent contractor-ification of various types of work at these tech companies (not just tech roles, but other types of roles at these companies). already, contractor-to-employee ratio has gotten higher and higher at these companies in recent years and I expect this to continue.

ZIRP (especially the "double tap" ZIRP in 2021/2022) created this monster (bootcamp devs getting hired, big tech devs making "day in the life of" tiktok vids).

contractors give:

instant scale up/down without layoff optics

no benefits overhead

no severance obligations

easy performance management (just don't renew)

this mirrors what other industries typically do after large restructuring waves ... manufacturing got temp agencies and staffing firms as permanent fixtures post-rust belt collapse. tech is just catching up to the same playbook.

> Europe just became a lower-cost extension of Silicon Valley.

Pretty much spot on. The UK is included in the above definition of Europe as well.

> Yesterday, the news of 16k Amazon layoffs plus two LinkedIn posts on the same topic back-to-back encouraged me to finally write about it.

Amazon is fundamentally a logistics + robotics company and is one of the worst companies to join for 'stability' as they have razor-thin margins.

With almost 1.6M workers, the layoffs there are at least in the five figures, and they will not hesitate to do the easiest thing possible to increase profit margins: take jobs away from warehouse workers (using robots) and corporate staff (using AI agents).

> Most engineers (including me) spent months grinding LeetCode at least twice in their career, studying system design, and passing grueling 6-round interviews to prove they are the “top 1%.”

LeetCode can be easily gamed and cheated, and is a waste of time.

Now you need to make money for yourself instead of dancing through performative interviews, since an AI agent + human outperforms over 90% of workers doing the mundane work at Amazon anyway.

You are being scammed without knowing it.

  • > Amazon is fundamentally a logistics + robotics company and is one of the worst companies to join for 'stability' as they have razor-thin margins.

    You seem to forget that AWS exists.

  • They have an 11% operating margin. That is far from razor thin. Their biggest business is AWS. They have subscription income, and they sell digital downloads and streaming which are high margin.

    https://s2.q4cdn.com/299287126/files/doc_financials/2025/ar/...

    • Yeah, nobody makes this “margins are too low” complaint about Costco, which makes the majority of its profit on memberships, the exact same business model as Amazon Prime.

      Amazon the logistics company is paid for by the $100+ per year that its customers just give to Amazon to get basically nothing in return.

    • All thanks to previous layoffs since 2022, in which they cut 27,000 jobs from 2022 to 2023 due to mass over-hiring.

      Compared to the other FAANG companies, these margins are not only thin but terrible; Amazon has the worst margins in FAANG, AWS or not.

  > Till ~2010, a layoff was a sign of failure. It meant the CEO messed up.
  > 
  > In 2024, a layoff is a signal of “discipline.” Companies lay off thousands, and their stock price jumps.

Citation needed. The author started their career after 2010, so they are not basing that on personal experience. In my experience this is not true.

Is this just for the big guys like FANG, AI companies?

There is a huge part of the tech market that is just humdrum blue-collar software: keep the ERP system running, build a new efficiency report, troubleshoot why payroll missed Bob last week because of an un-validated text entry field.
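The payroll example is the kind of bug a one-line validation step prevents; a hypothetical sketch (the function and field names here are invented for illustration):

```python
# Hypothetical sketch: an unvalidated free-text field silently breaks a
# downstream payroll lookup; normalizing the input first avoids it.
def normalize_employee_id(raw: str) -> str:
    # Trim whitespace and reject garbage instead of passing it through.
    cleaned = raw.strip()
    if not cleaned.isalnum():
        raise ValueError(f"invalid employee id: {raw!r}")
    return cleaned.upper()

payroll = {"BOB42": 5000}
# " bob42 " typed into a raw text field would miss the dict lookup;
# the normalized form matches.
print(payroll[normalize_employee_id(" bob42 ")])  # 5000
```

Unglamorous work, but exactly the kind that keeps Bob paid.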

Or, because of the years of zero interest rates, tons more people went into software, so now the field is over-populated, which puts pressure on regular humdrum software jobs.

Mostly true, but AI accelerated that process:

- A senior engineer alone is now often sufficient for most tasks; junior engineers seem like a burden rather than a boost during the development process

- Companies feel comfortable with hiring fast and firing fast

- The tech market is now flooded with not-so-good engineers who lean heavily on good AI coding assistants – which are already capable of solving 80% of the problems – and who are ready to work for much less than really experienced engineers

In general, yes, companies overhired software developers, hoping they would keep hyper-growing, but then reality kicked in – this was not sustainable for most businesses.

Should information technology be a stable employment sector?

As far as I'm concerned, the main purpose of IT is to automate work. Tech companies make these systems of automation and provide them to other industries, so they can automate.

Making a program or an IT system is something you only do once. So once it is completed, it is expected that a lot of people who helped make it have to go. It's like building a skyscraper. Massive amounts of work to build it, and when it's finished, most workers have to move on.

Of course an IT company can continue to expand in perpetuity, but what if they don't have the leadership talent or resources to create a new giant project after one has finished? Then the sensible thing is to downsize.

"Well don't hire too many people in the first place to rush your project into completion" - Then you get left behind.

  • There is churn in the work being done. An automation solution for one company shouldn't be expected to work for the next company. Every company has different processes, and developing software to fit the exact needs of one company will continue to be useful.

    Some software is finished, like Microsoft Office 2003, and requires no additional work except to force ads on people. Those jobs may end.

I am about to give my take on software development:

Most [sane] software out there, but not all, has a development time that is ridiculously short compared to its life cycle (you could code it in binary machine code, for several ISAs, and it would not even matter).

Then, it is extremely hard to justify _HONESTLY_ a permanent income in software development. Really, really hard.

The industry was always a disaster, but if you take a disaster and put AI in the mix you get a disaster at light speed. It was always getting worse and worse but now it’s speedrunning it.

  • Computers are a lever, a force multiplier. If your process is shit, you can shit faster with a computer.

    LLMs are another force multiplier. If your computerized process is a disaster … well, you said it and you’re right.

  • Not so much AI itself, but the billions invested into it and the hardware required – anything you invest billions into that doesn't have a comparable return on investment is a fast track to an economic crash / correction. The Y2K-era tech companies are a previous example: investors were champing at the bit to invest in, uh, pets.com or whatever, but it turned out those companies didn't / wouldn't make enough money to earn it back.

    • I agree with the economic argument.

      But I literally mean if you have a crappy business and put AI into it you’re just gonna make your business worse.

      AI as a tool is not actually a solution for very much. AI can make a good process better, but it will also make bad processes way worse.

      It’s a power drill upgrade when you were previously only using a screwdriver, but it’s still not a table saw.

I remember after the Brexit vote (i.e. years before anything actually changed) there was a rash of British companies who clearly would have gone bankrupt anyway blaming their ailing fortunes on Brexit.

There is remarkably little pushback from journalists on company narratives about layoffs or ailing economic fortunes, which is odd, because more often than not those narratives are not truthful.

The Brexit vote is nothing like this though. AI is probably the biggest corporate gaslighting exercise I've ever seen in my entire life.

  • This is a symptom of a shifting media landscape. The tech industry press and the American press generally speaking are stunningly corrupt.

  • There were some remarkable claims made about Brexit. The best I can recall was a bus company closing a local bus route "because of Brexit", a year or two after the vote and therefore years before it actually happened.

    Also very common to blame things on health and safety, GDPR, etc.

Westinghouse Electric Corporation built early electrical grid turbines, and worked extensively with consumer product markets to close the circle of demand. Accordingly, more retail products required more energy, and more energy required more turbines.

The LLM proponents are trying the same naive move with intangible assets, but dismiss the finite limits of externalized costs on surrounding community infrastructure. "AI" is thus in direct competition with the foundational economic resources of modern civilization. The technical side just added a facade of legitimacy to an economic fiction.

https://en.wikipedia.org/wiki/Competitive_exclusion_principl...

Thus, as energy costs go up, the living standards of Americans are bid down. Individuals can't fix irrational movements, but one may profit from their predictable outcome. We look forward to stripping data centers for discounted GPUs. =3

"Memoirs of extraordinary popular delusions and the madness of crowds" (Charles Mackay, 1852)

https://www.gutenberg.org/files/24518/24518-h/24518-h.htm

The speculative nature of tech has always been present.

The point is mass media communication, frictionless money movement across the world, and market access so freely available to the small retail investor.

It's a recipe for disaster because an extraordinary claim can attract billions of dollars with nothing but hope and dreams to back it up.

Imagine if the Wright Brothers had today's markets and mass media at their disposal: they'd be showered in billions or even trillions, even though the actual product didn't make any money because it was R&D.

[flagged]

  • Cult members don't want to hear their false god is a liar.

    Apparently 73% of LLM resources are used for emotional context support.

    People need to go outside for a daily walk, and meet real people. Most folks are actually fun to be around. =3

>Europe just became a lower-cost extension of Silicon Valley.

Stay away from Europe while you can.

IMO the issue is even more fundamental than the article presents. The software product market is approaching a saturation point. All the low hanging fruit has been commoditized, so buying something off the shelf is now generally preferable to having engineers on your staff. After the pandemic hiring spree, even the big tech companies realized they didn't have enough productive work for all these engineers. The effect is compounded by the return to normal interest rates, so investors are no longer desperate to dump their cash into every random startup idea. Ultimately, software engineers aren't going away, but the era of desperate need is over for good.