Comment by simonsarris

7 months ago

I think the real white collar bloodbath is that the end of ZIRP was the end of infinite software job postings, and the start of layoffs. I think it's easy to now point to AI, but it seems like a canard for the huge thing that already happened.

just look at this:

https://fred.stlouisfed.org/graph/?g=1JmOr

In terms of magnitude the effect of this is just enormous and still being felt; postings never recovered to pre-2020 levels, and may never. (With pre-pandemic job postings indexed to 100, software is at 61.)

Maybe AI is having an effect on IT jobs though, look at the unique inflection near the start of 2025: https://fred.stlouisfed.org/graph/?g=1JmOv

For another point of comparison, construction and nursing job postings are higher than they were pre-pandemic (about 120 and 116 respectively, where pre-pandemic is indexed to 100; banking jobs still hover around 100).
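For readers unfamiliar with indexed series: a value of 61 means postings sit at 61% of the base-period level. A minimal sketch of the computation, with made-up posting counts (not the actual FRED data):

```python
# Sketch of how an "indexed to 100" series like the FRED/Indeed graphs works:
# each observation is rescaled relative to a chosen base period.
def index_to_base(values, base_idx=0):
    """Rescale a series so the observation at base_idx equals 100."""
    base = values[base_idx]
    return [round(v / base * 100, 1) for v in values]

postings = [5000, 5500, 4200, 3050]  # hypothetical monthly posting counts
print(index_to_base(postings))       # [100.0, 110.0, 84.0, 61.0]
```

The last value illustrates the thread's point: 3050 postings against a base of 5000 indexes to 61.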

I feel like this is almost going to become lost history because the AI hype is so self-insistent. People a decade from now will think Elon slashed Twitter's employee count by 90% because of some AI initiative, and not because he simply thought he could run a lot leaner. We're on year 3-4 of a lot of other companies wondering the same thing. Maybe AI will play into that eventually. But so far companies have needed no such crutch for reducing headcount.

IMO this is dead on. AI is a hell of a scapegoat for companies that want to save face and pretend that their success wasn't because of cheap money being pumped into them. And in a world addicted to status games, that's a gift from the heavens.

  • ZIRP is an American thing? In that case maybe we could try comparisons with the job markets in other developed Western countries that didn't have this policy. If it was because of ZIRP, then their job markets should show clearly different patterns.

    • ZIRP was a central banking thing, not just an American phenomenon. At least in the tech industry, the declines we're seeing in job opportunities are a result of capital being more expensive for VCs, meaning fewer investments are made (both in new and existing businesses), meaning there's less cash to hire and expand with. It just felt like the norm because ZIRP ran more or less uninterrupted for 10 years.

      You're right that we should see comparisons in other developed countries, but with SV being the epicenter of it all, you'd expect the fallout to at least appear more dramatic in the U.S.

      And an overwhelming number of (focusing exclusively on the U.S.) tech "businesses" weren't businesses (i.e., little to no profitability). At best they were failed experiments, and at worst, tax write-offs for VCs.

      So, what looked like a booming industry (in the literal, "we have a working, profitable, cash-flowing business here" sense) was actually just companies being flooded with investment cash that they were eager to spend in pursuit of rapid growth. Some found profitability, many did not.

      Again, IMO, AI isn't so much the cause as it is the bandage over the wound of unprofitability.

    • There isn’t anything magical about precisely zero percent interest rates; the behavior we see is mostly a smooth extension of slightly higher rates, which is where the EU was.

      And of course ZIRP was pioneered in Japan, not the US.

Such an important point. I've long suspected the end of ZIRP is a much greater influence on white collar work than we admit. AI is going to take all the negative press, but the flow of capital is ultimately what determines how the business works, which determines what software gets built. Conway's law 101. The white collar bloodbath is more of a haircut to shed waste accumulated during the excesses of ZIRP.

  • AI also happens to be a perfect scapegoat: CEOs who over-hired get to shift the blame to this faceless boogeyman, and (bonus!) new hires are more desperate/willing to accept worse compensation.

  • ZIRP, and then the final gasp of COVID-bubble over-hiring.

    At least in my professional circles the number of late 2020-mid 2022 job switchers was immense. Like 10 years of switches condensed into 18-24 months.

    Further, lots of experiences and anecdotes from people who saw their company/org/team double or triple in size compared to 2019.

    Despite some waves of Mag7 layoffs, I think we are still digesting what was essentially an over-hiring bubble.

  • Is it negative press for AI, or is it convincing some investors that it’s actually causing a tectonic shift in the workforce and economy? It could be positive in some sense. Though ultimately negative, because the outcomes are unlikely to reflect a continuation of the perceived impact or imaginary progress of the technology.

Also, Section 174's amortization of software development costs had a big role.

  • I agree; the R&D change is what triggered the 2022 tech layoffs. Coders used to be effectively free, and all this play with the Metaverse and such was on the public dime. As soon as a company had to spend real money, it all came crashing down.

    • This is a weird take. Employees are supposed to be business expenses; that's the core idea of running a business: profit = revenue - expenses, where expenses are personnel/materials, and you pay taxes on profit. Since the R&D change, businesses can't fully expense developer salaries in the year they're paid (they have to amortize them) and so effectively pay (business) taxes on money already spent. Employees, of course, still pay personal taxes as well (as was always the case).


What's happening now is similar to the dot-com bust of the early 2000s. Having barely survived that time, I saw this one coming, and people told me I was crazy when I told them to hold on to their jobs and quit job-hopping, because the job-hopper is very often the first one to get laid off.

In 2000 I moved cities with a job lined up at a company run by my friends. I had about 15 good friends working there, including the CEO, and I was guaranteed a software development job; the interview was supposed to be just a formality. So I moved, went in to see the CEO, and he told me he could not hire me: funding was cut and there was a hiring freeze. I was devastated. Now what? Well, I had to freelance and live on whatever I could scrape together, which was a few hundred bucks a month if I was lucky. Fortunately the place I moved into was a big house with my friends who worked at said company, and since my rent was so low at the time, they covered me for a couple of years. I did eventually get some freelance work from the company, but things did not really recover until about 2004, when I finally got a full-time programming job after 4 very difficult years.

So many tech companies over-hired during covid, there was a gigantic bubble happening with FAANG and every other tech company at the time. The crash in tech jobs was inevitable.

I feel bad for people who got left out in the cold this time, I know what they are going through.

  • Those are some great friends. Aside from job hoppers, I've noticed a lot of company loyalists getting canned too, though (e.g., worked at MSFT 10 years).

    • It's not exactly the same this time around; the dot-com bubble was a bit different. But both then and now were preceded by huge hiring bubbles and stupid valuations. It's a little different 25 years later: tech has advanced, and AI means cutting the fat out of a lot of companies, even Microsoft.

      AI is somewhat creating a similar bubble now, because investors still have money and the current AI efforts are way over-hyped. The $6.5 billion paid to acquihire Jony Ive is a symptom of that.

Keynes suggested that by 2030, we’d be working 15 hour workweeks, with the rest of the time used for leisure. Instead, we chose consumption, and helicopter money gave us bullshit jobs so we could keep buying more bullshit. This is fairly evident by the fact when the helicopter money runs out, all the bullshit jobs get cut.

AI may give us more efficiency, but it will be filled with more bullshit jobs and consumption, not more leisure.

  • Keynes lived in a time when the working class was organized and exerting its power over its destiny.

    We live in a time that the working class is unbelievably brainwashed and manipulated.

    • > Keynes lived in a time when the working class ...

      Keynes lived in a time when the working class could not buy cheap from China... and complain that everybody else was doing the same!

    • He was extrapolating, as well. Going from children in the mines to the welfare state in a generation was quite something. Unfortunately, progress slowed down significantly for many reasons but I don’t think we should really blame Keynes for this.

      > We live in a time that the working class is unbelievably brainwashed and manipulated.

      I think it has always been that way. Looking through history, there are many examples of turkeys voting for Christmas and propaganda is an old invention. I don’t think there is anything special right now. And to be fair to the working class, it’s not hard to see how they could feel abandoned. It’s also broader than the working class. The middle class is getting squeezed as well. The only winners are the oligarchs.


    • It is very possible that foreign powers use AI to generate social media content en masse for propaganda. If anything, the internet up to 2015 seemed open to discussion and swaying by real people's opinions (and mockery of the elite classes), while manipulation and manufactured consent became the norm after 2017.


    • He also lived in a time when the intense importance and function of a moral and cultural framework for society was taken for granted. He would have never imagined the level of social and moral degeneration of today.

      I will not go into specifics because the authoritarians still disagree and think everything is fine with degenerative debauchery and try to abuse anyone even just pointing to failing systems, but it all does seem like civilization ending developments regardless of whether it leads to the rise of another civilization, e.g., the Asian Era, i.e., China, India, Russia, Japan, et al.

      Ironically, I don’t see the US surviving this transitional phase, especially considering it essentially does not even really exist anymore at its core. Would any of the founders of America approve of any of America today? The forefathers of India, China, Russia, and maybe Japan would clearly approve of their countries and cultures. America is a hollowed out husk with a facade of red, white, and blue pomp and circumstance that is even fading, where America means both everything and nothing as a manipulative slogan to enrich the few, a massive private equity raid on America.

      When you think of the Asian countries, you also think of distinct and unique cultures that all have their advantages and disadvantages, the true differences that make them true diversity that makes humanity so wonderful. In America you have none of that. You have a decimated culture that is jumbled with all kinds of muddled and polluted cultures from all over the place, all equally confused and bewildered about what they are and why they feel so lost only chasing dollars and shiny objects to further enrich the ever smaller group of con artist psychopathic narcissists at the top, a kind of worst form of aristocracy that humanity has yet ever produced, lacking any kind of sense of noblesse oblige, which does not even extend to simply not betraying your own people.


  • If you work 15 hours/week then presumably someone who chose to work 45 hours/week would make 3x more money.

    This creates supply-demand pressure for goods and services. Anything with limited supply such as living in the nice part of town will price out anyone working 15 hours/week.

    And so society finds an equilibrium…

    • Presumably the reduction to a 15 hour workweek would be much the same as the reduction to the 40 hour workweek - everyone takes the same reduction in total hours and increase in hourly compensation encoded in labor laws specifically so there isn't this tragedy of the commons.


  • I think something Keynes got wrong there, and much AI job discussion ignores, is that people like working, provided the job is fun. Look at the richest people with no need to work: Musk, Buffett, etc. Still working away, often well past retirement age, with no need for the money. Keynes himself, wealthy and probably with tenure, kept working away on his theories. In the UK you can quite easily go on disability allowance and do nothing, and many do, but they are not happy.

    There can be a certain snobbishness with academics: of course *I* enjoy working away on my theories of employment, but the unwashed masses do crap jobs and would rather sit on their arses watching reality TV. But it isn't really like that. Usually.

    • The reality of most people is that they need to work to financially sustain themselves. Yes, there are people who just like what they do and work regardless, but I think we shouldn't discount the majority which would drop their jobs or at least work less hours had it not been out of the need for money.


    • What percentage of people would you say like working for fun? Would you really claim they make up a significant portion of society?

      Even I, working a job that I enjoy, building things I'm good at, almost stress free, find after 10-15 years that I would much rather spend time with my family, or even spend a day doing nothing, than spend another hour doing work for other people. The work never stops coming, and the meaninglessness is stronger than ever.


    • Meanwhile, your examples of happy workers are all billionaires who do whatever they want, and your example of unhappy non-workers is disabled people.

  • Not to undercut your point - because you’re largely correct - but this is my reality. I have a decent-paying job in which I work roughly 15 hrs a week. Sometimes more when work scales up.

    That said, I’m not what you’d call a high-earning person (I earn < 100k). I simply live within my means and do my best to curb lifestyle creep. In this way, Keynes’ vision is a reality, but it’s a mindset, and we also have to know when enough wealth is enough.

    • You're lucky. Most companies don't accept that. Frequently, even when they have part time arrangements, the incentives are such that middle managers are incentivized to squeeze you (including squeezing you out), despite company policies and HR mandates.


    • I'm working hard on this one. I'm down to a three-day week, and am largely keeping the boundaries around those other four.

      It came about late last year when my current employer started getting gently waved off in early funding pitches. That resulted in some thrash, forced marches to show we could ship, and the attendant burnout for me and a good chunk of the team I managed. I took a hard look at where the company was and where I was, and decided I didn't have another big grind in me right now.

      Rather than just quit like I probably would have previously, I laid it out to our CEO in terms of what I needed: more time taking care of my family and myself, less pressure to deliver impossible things, and some broad idea of what I could say "no" to. Instead of laughing in my face, he dug in, and we had a frank conversation about what I _was_ willing to sign up for. That in turn resulted in a (slow, still work-in-progress) transition where we hired a new engineering leader and I moved into a customer-facing role with no direct reports.

      Now I work a part-time schedule, so I can do random "unproductive" things like repair the dishwasher, chaperone the kid's field trip, or spend the afternoon helping my retired dad make a Costco run. I can reasonably stop and say, "I _could_ pay someone to do that for me, but I actually have time this week and I can just get it done," and sometimes I...actually do, which is kind of amazing?

      ...and it's still fucking hard to watch the big, interesting decisions and projects flow by with other people tackling them and not jump in and offer to help. B/c no matter what a dopamine ride that path can be, it also leads to late nights and weekends working and traveling and feeling shitty about being an absentee parent and partner.

  • Most people are leisuring at work (by Keynes-era standards) and also getting paid for it.

  • > Keynes suggested that by 2030, we’d be working 15 hour workweeks, with the rest of the time used for leisure.

    I suspect he didn't factor in how many people would be retired and on entitlements.

    We're not SUPER far from that now, when you factor in how much more time off the average person has, how much larger a percentage of the population is retired, and how much of a percentage is on entitlements.

    The distribution is just very unequal.

    I.e., if you're the median worker, you've probably seen almost no benefit, but if you're old or on entitlements, you've seen a lot of benefit.

  • > Keynes suggested that by 2030, we’d be working 15 hour workweeks

    Most people with a modest retirement account could retire in their forties to working 15-hour workweeks somewhere in rural America.

      The trade is that you need to live in a VHCOL city to earn enough and keep a high savings rate, while avoiding spending it all on VHCOL real estate.

      And then after living at the center of everything for 15-20 years be mentally prepared to move to “nowhere”, possibly before your kids head off to college.

      Most cannot meet all those conditions and end up on the hedonic treadmill.


  • Keynes also convinced us that high unemployment and high inflation couldn't happen at the same time. This was proven wrong in the early 1970s.

  • It's more likely 15% of the workforce will have jobs. They'll be working eighty hour weeks and making just enough to keep them from leaving.

  • Now one has to work 60 hours to afford housing (rent/mortgage) and insurance (health, home, automotive). Yes, food is cheap, if one can cook.

  • > Keynes suggested that by 2030, we’d be working 15 hour workweeks

    Yeah, I'd say I get up to 15 hours of work done in a 40 hour workweek.

  • "Bullshit jobs" are the rubbish required to keep the paperwork tidy, assessed and filed. No company pays someone to do -nothing-.

    AI isn't going to generate those jobs, it's going to automate them.

    ALL our bullshit jobs are going away, and those people will be unemployed.

    • > "Bullshit jobs" are the rubbish required to keep the paperwork tidy, assessed and filed.

      It's also the jobs that involve keeping people happy somehow, which may not be "productive" in the most direct sense.

      One class of people that needs to be kept happy are managers. What makes managers happy is not always what is actually most productive. What makes managers happy is their perception of what's most productive, or having their ideas about how to solve some problem addressed.

      This does, in fact, result in companies paying people to do nothing useful. People get paid to do things that satisfy a need that managers have perceived.

    • AI is going to 10x the amount of bullshit, fully automating the process.

      NONE of the bullshit jobs are going away, there will simply be bigger, more numerous bullshit.

  • Keynes was talking about work in every sense, including household chores. We're well below 15 hours of chores a week by now, so that part became true.

    • Washing machines created a revolution: we can now expend a tenth of the human labour to wash the same amount of clothes as before. And we now have more than ten times as many clothes to wash.

      I don’t know if it’s induced demand, revealed preference or the Jevons paradox; maybe all 3.


    • We've got 10 whole hours left over for "actual" work!

      (Quotes because I personally have a significantly harder time doing bloody housework...)

Why would you interpret data cut off at 2020 so that you're just looking at a covid phenomenon? The buttons don't seem to do anything on that site, but why not consider 2010-2025?

That said, the vibe has definitely shifted. I started working in software in uni ~2009 and every job I've had, I'd applied for <10 positions and got a couple offers. Now, I barely get responses despite 10x the skills and experience I had back then.

Though I don't think AI has anything to do with it, probably more the explosion of cheap software labor on the global market, and you have to compete with the whole world for a job in your own city.

Kinda feels like some major part of the gravy train is up.

As of now, yes. But we are still in day 0.1 of GenAI. Do you think this will be the case when o3 models are 10x better and 100x cheaper? There will be a turning point, but it hasn’t happened yet.

  • Yet we're what? 5 years into "AI will replace programmers in 6 months"?

    10 years into "we'll have self driving cars next year"

    We're 10 years into "it's just completely obvious that within 5 years deep learning is going to replace radiologists"

    Moravec's paradox strikes again and again. But this time it's different and it's completely obvious now, right?

    • I basically agree with you, and I think the thing that is missing from a bunch of responses that disagree is that it seems fairly apparent now that AI has largely hit a brick wall in terms of the benefits of scaling. That is, most folks were pretty astounded by the gains you could get from just stuffing more training data into these models, but like someone who argues a 15 year old will be 50 feet tall based on the last 5 years' growth rate, people who are still arguing that past growth rates will continue apace don't seem to be honest (or aware) to me.

      I'm not at all saying that it's impossible some improvement will be discovered in the future that allows AI progress to continue at a breakneck speed, but I am saying that the "progress will only accelerate" conclusion, based primarily on the progress since 2017 or so, is faulty reasoning.


    • As far as I've seen, we already appear to have self-driving vehicles; the main barriers are legal and regulatory rather than technical. If a company wanted to put a car on the road that beetles around by itself, there aren't any crazy technical challenges to doing that. The issue is that even if it were safer than a human driver, the company would have a lot of liability problems.


    • 100% this. I always argue that groundbreaking technologies are clearly groundbreaking from the start. It's almost like a film: if you have to struggle to get into it in the first few minutes, you may as well spare yourself the rest.

  • We’re already heading toward the sigmoid plateau. The GPT-3 to GPT-4 shift was massive; nothing since has touched that. I could easily go back to the models I was using 1-2 years ago with little impact on my work.

    I don’t use RAG, and have no doubt the infrastructure for integrating AI into a large codebase has improved. But the base model powering the whole operation seems stuck.

    • > I don’t use RAG, and have no doubt the infrastructure for integrating AI into a large codebase has improved

      It really hasn't.

      The problem is that a GenAI system needs to not only understand the large codebase but also the latest stable version of every transitive dependency it depends on. Which is typically in the order of hundreds or thousands.

      Having it build a component with 10-year-old, deprecated, CVE-riddled libraries is of limited use, especially since libraries tend to be upgraded in interconnected waves. Such a component will likely not even work anyway.

      I was assured that MCP was going to solve all of this but nope.


    • > I could easily go back to the models I was using 1-2 years ago with little impact on my work.

      I can't. GPT-4 was useless for me for software development. Claude 4 is not.


  • I use LLMs daily and love them, but at the current rate of progress it’s just not really something worth worrying about. Those who are hysterical about AI seem to think LLMs are getting exponentially better when in fact diminishing returns are hitting hard. Could some new innovation change that? It’s possible, but it’s not inevitable, or at least not necessarily imminent.

    • I agree that the core models are only going to see slow progression from here on out, until something revolutionary happens... which might be a year from now, or maybe twenty years. Who knows.

      But we are going to see a huge explosion in how those models are integrated into the rest of the tech ecosystem. Things that a current model could do right now, if only your car/watch/videogame/heart monitor/stuffed animal had a good working interface into an AI.

      Not necessarily looking forward to that, but that's where the growth will come.

  • How are we in day 0.1 of GenAI? It's been in development for nearly a decade now.

    And each successive model release has done nothing to fundamentally change the use cases the technology can be applied to, i.e., those which are tolerant of a large percentage of incoherent mistakes. Which isn't all that many.

    So you can keep your 10x better and 100x cheaper models because they are of limited usefulness let alone being a turning point for anything.

  • How does it work if they get 10x better in 10 years? Everything else will have already moved on, and the actual technology shift will come from elsewhere.

    Basically, what if GenAI is the Minitel and what we want is the internet.

  • 10× better by what metric? Progress on LLMs has been amazing but already appears to be slowing down.

    • With autonomous vehicles, the narrative of imperceptibly slow incremental change, of chasing 9's, is still the zeitgeist, despite an actual 10x improvement over human drivers in fatality rates already existing.

      There is a lag in how humans react to AI, probably a reflexive aspect of human nature. There are so many strategies being employed to minimize progress in a technology that did not exist 3 years ago and now represents a frontier of countless individual disciplines.


  • Frankly, we don't know. The "turning point" that seemed so close for many technologies never came for some of them. Think 3D printing, which was supposed to take over manufacturing. Or self-driving, which has been "just around the corner" for a decade now and is still probably a decade away. Only time will tell whether GenAI/LLMs are color TV or 3D TV.

    • > Think 3D-printing that was supposed to take over manufacturing.

      3D printing is making huge progress in heavy industries. It’s not sexy and does not make headlines but it absolutely is happening. It won’t replace traditional manufacturing at huge scales (either large pieces or very high throughput). But it’s bringing costs way down for fiddly parts or replacements. It is also affecting designs, which can be made simpler by using complex pieces that cannot be produced otherwise. It is not taking over, because it is not a silver bullet, but it is now indispensable in several industries.


    • > Think 3D-printing that was supposed to take over manufacturing

      This was never the case, and that is obvious to anyone who has ever been to a factory doing mass-produced plastics.

      > Or self-driving, that is "just around the corner" for a decade now.

      But it really is around the corner; all that remains is to accept it. That is, to start building and modifying road infrastructure and changing traffic rules to enable effective integration of self-driving cars into road traffic.


  • > 5 years into "AI will replace programmers in 6 months"?

    Programmers who don't use AI will get replaced by those who do (not just by mandate, but by performance).

    > 10 years into "we'll have self driving cars next year"

    They're here now. Waymo does 250K paid rides/week.

  • There's a lot of "when" people are betting on, and not a lot of action to back it. If "when" is 20 years, then I still have plenty of career ahead of me before I need to worry about it.

  • > Do you think this will be the case when o3 models are 10x better and 100x cheaper?

    Why don't you bring it up then?

    > There will be a turning point but it’s not happened yet.

    Do you know something the rest of us don't?

ZIRP had little to do with it. Tech is less levered than any other major industry. What happened is that growth expectations for large tech companies were way out of line with reality and finally came back down to earth when the market finally realized that the big tech cos are actually mature profitable companies and not just big startups. The fact that this happened at the same time ZIRP ended is a coincidence.

Saw something similar the other day. X was awash with stories that IBM was laying off several thousand people in its HR department due to AI. Then over the course of the day the story shifted: IBM was outsourcing those roles to India. It was a very interesting transition; it seemed intentional.

  • IBM seems to have outsourced recruiting to Indian firms too, and it's awful. The accounts that contact me on LinkedIn are grossly unprofessional and downright nasty.

> because he simply thought he could run a lot leaner

Because he suddenly had to pay interest on that gigantic loan he (and his business associates) took out to buy Twitter.

It may not be the only reason for everything that happened, but it sure is simple and has some very good explanatory powers.

  • Other companies have different reasons to cut costs, but the incentive is still there.

    • Stocks are valued against the risk-free rate, or so the saying goes.

      Doubling the interest rate from 0.1% to 0.2% does a lot to your DCF models, and in this case we went from zero (or in some cases negative) to several percentage points. Of course stock prices tanked. That's what any schoolbook will tell you, and that's what any investor will expect.

      Companies thus have to start turning dials and adjust parameters to make number go up again.
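      The sensitivity described above can be illustrated with a toy discounted-cash-flow calculation (numbers are hypothetical, not from the thread): discounting the same long-lived cash flows at a post-ZIRP rate instead of a near-zero rate roughly halves their present value.

```python
# Toy DCF: present value of a fixed cash-flow stream under two discount rates.
def present_value(cashflows, rate):
    """Sum of cash flows discounted back to today at the given annual rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows, start=1))

cfs = [100.0] * 30  # $100/year for 30 years

pv_low = present_value(cfs, 0.001)   # near-zero (ZIRP-era) rate
pv_high = present_value(cfs, 0.05)   # post-ZIRP rate
print(round(pv_low), round(pv_high)) # the PV falls by nearly half
```

      The longer-dated the cash flows (i.e., the more "growth stock" the profile), the harder the rate move bites.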

FRED continues to amaze me with the kind of data they have available.

  • That's from Indeed. And Indeed has fewer job postings overall [https://fred.stlouisfed.org/series/IHLIDXUS]. Should we normalize the software jobs by the total number of Indeed postings? Is Indeed getting less popular or more popular over this time period? Data is complicated.

    • Look at that graph again. It's indexed to 100 on Feb 1, 2020, and it's now at 106. In other words, after all the pandemic madness, the total number of job postings on Indeed is slightly larger than it was before, not smaller.

      But for software, it's a lot smaller.
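      Using the two index values quoted in this thread (software at 61, total at 106), the normalization the parent comment asks about would look like this:

```python
# Hypothetical normalization: software postings relative to all Indeed
# postings, both indexed to 100 on Feb 1, 2020 (values quoted in this thread).
software_index = 61.0
total_index = 106.0

relative = round(software_index / total_index * 100, 1)
print(relative)  # 57.5
```

      So even adjusted for Indeed's overall volume, software postings are down roughly 40% relative to their pre-pandemic share, which makes the decline look worse, not better.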

> People a decade from now will think Elon slashed Twitter's employee count by 90% because of some AI initiative, and not because he simply thought he could run a lot leaner.

That part is so overblown. Twitter was still trying to hit moonshots; X is basically in "keep the lights on" mode, as Musk doesn't need more. Yeah, if Google decides it doesn't want to grow anymore, it can probably cut its workforce by 90%. And it will be as irrelevant as IBM within 10 years at most.

  • What moonshots has Twitter gone for in the last decade? Feature velocity is also higher since the acquisition.

    • "Moonshots" was probably a bad term. Twitter devs used to be very active in open source, in Scala, actors, etc. in particular. Fairly sure that's all dead. By most reports, the majority of current Twitter devs are basically visa-shackled to the company.

Macroeconomic policy always changes, recessions come and go, but it's not a permanent change in the way e-commerce or AI is.

ZIRP jobs, n., jobs the compensation for which is derived from zero interest loans, often in the form of venture capital, instead of reserves, profits or other sources

"When interest rates return to normal levels, the ZIRP jobs will disappear." -- Wall Street analyst

Honestly, if anything I think AI is going to reverse the trend. Someone is going to have to be hired to clean up after it.

  • I think they said that about outsourcing software dev jobs. The reality is somewhere in the middle: extreme cases will need cleanup, but overall it's here to stay, maybe with more babysitting.

    • I think the reality is Lemon Market Economics. We'll sacrifice quality for price. People want better quality but the truth is that it's a very information asymmetric game and it's really hard to tell quality. If it wasn't, we could all just rely on Amazon reviews and tech reviewers. But without informed consumers, price is all that matters even if it creates a market nobody wants.

  • That's the impression I got. Things overall just get worse in quality because people rely too much on low wages and copy-pasting LLM answers.

    • I think that's true in software development. A lot of the focus is on coding because that's really the domain of the people interested in AI, since ultimately they ARE software. But the killer app isn't software; it's anything where the operation is formulaic, the formula is tedious to figure out, but once you know it you can confirm it's correct by working backwards. Software has far too many variables, not least of which is the end user. On the other hand, things like accounting, finance, and engineering are far more suitable for trained models and back-testing for conformity.

The flaw with the ZIRP narrative is that companies managed to raise more money than ever before the moment they had a somewhat believable narrative instead of the crypto/web3/metaverse nonsense.

Always disheartening how much people forget and tolerate the underlying deliberate human absurdity that created these events.

Almost no one has seen a world where the price of money wasn't centrally planned: a committee of experts deciding it based on gut feel, like they did in command economies such as the Soviet Union.

And then thousands of people's lives are disrupted as the interest rate swings wildly due purely to government action (COVID lockdowns and the Fed's ZIRP response), and it all somehow just ends up with people talking about AI instead.

The true wrongdoers get absolutely no consequences, and we all just carry on like there's no problem. Often because our taxes go to paying hordes of academics and economists to produce layers and layers of sophisticated propaganda that of course this system is the best one.

Absurd and shitty world.

It's simply the old Capital vs Labor struggle. CEOs and VCs all sing in the same choir, and for the past 3 years the tune is "be leaner".

p.s.: I'm a big fan of yours on Twitter.

  • Except Labor in Tech is unique in that it has zero class consciousness and often actively roots for their exploiters.

    If we were to unionize, we could force this machine to a halt and shift the balance of power back in our favor.

    But we don't, because many of us have been brainwashed to believe we're on the same side as the ones trying to squeeze us.

    • I think the issues at play here are the quickly changing job descriptions, RSUs, and the higher-paid bunch benefiting from very unequal pay across a job category.

  • > the tune is "be leaner".

    Seems like they're happy to start cutting limbs to lose weight. It's hard to keep cutting fat when you've been aggressively cutting fat for so long. If the last CEO did their job there shouldn't be much fat left.

    • > If the last CEO did their job there shouldn't be much fat left

      Funny how that fat analogy works... because the head (brain) has a lot more fat content than muscles/limbs.


    • Yet this will continue until it grinds to a halt.

      It's amazing and cringey, the level of parroting performed by executives. Independent thought is very rare amongst business "leaders".


That inflection point seems to more specifically start at the day of the new administration's inauguration.

It’s a shame that this is the top comment because it’s backward looking (“here’s why white-collar workers lost their jobs in the last year”) instead of looking forward and noticing that even if interest rates are reduced back to zero these jobs will not be performed by humans ever again. THAT is the message here. These workers need to retrain and move on.

  • > even if interest rates are reduced back to zero these jobs will not be performed by humans ever again

    It's not like companies laid off whole functions. These jobs will continue to be performed by humans - ZIRP just changes the number of humans and how much they get paid.

    > These workers need to retrain and move on.

    They only need to "retrain" insofar as they keep up with the current standards and practices. Software engineers are not going anywhere.

> unique inflection near the start of 2025

I wonder what happened in January 2025...

[flagged]

  • Trump didn't kick off the layoffs.

    It was the war with Russia that drove the Fed to raise interest rates in 2022 - a measure intended to curb inflation triggered by spikes in the prices of economic inputs (gas, oil, fertilizer, etc.).

    The tech layoffs started later that year.

    Widespread job cuts are an intended effect of raising interest rates - more unemployed = less spending = keeps a lid on inflation.

    AI is just cashing in on the trend.

    • "War with Russia" sounds like someone willingly started that war, and Russia was the target.

      Of course, nothing is further from the truth. "Russian invasion of Ukraine" is what should be written there.


    • $2 trillion in unnecessary COVID-related spending, when COVID's impact was winding down, was the key reason for inflation. "$2000 checks!" was the campaign slogan.


    • I don’t believe the war specifically drove the Fed to raise interest rates. Inflation and asset prices had risen sharply a year before the war.


    • Raising interest rates has nothing to do with the 2022 war. If it did, rates would have come back down. Interest rate increases don't help with supply/demand driven price spikes. They do help with money supply and aggregate demand driven inflation, which was the cause of our recent inflation (that started way before Russia invaded Ukraine). The war was a convenient excuse because it deflects responsibility.

      And remember when they first said inflation was "transitory" and caused by supply chain issues from the economy reopening after covid? They didn't raise interest rates then because, like I mentioned above, interest rates don't help with supply shocks. If they did, the Fed would have raised rates then.

    • Anecdotally, I detected a cooling starting in March of 2022.

      Was actively looking at this time for months prior and it went from a few recruiters a day reaching out to a few a week.

    • You're wrong: Trump's 2017 tax cut bill had a provision (Section 174 R&D amortization) that kicked in for tax year 2022 and caused the layoffs. Engineers became more expensive because companies now had to amortize their cost over 5 years instead of deducting it immediately.
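The arithmetic behind that claim, sketched with an illustrative $100k salary (the Section 174 change uses 5-year straight-line amortization with a half-year convention, so only 10% of the cost is deductible in year one):

```python
# Sketch of the Section 174 amortization change (TCJA, effective for
# tax years starting in 2022). Figures are illustrative.
salary = 100_000  # fully loaded cost of one developer

# Before 2022: software R&D salaries were deductible immediately.
old_year1_deduction = salary

# After: amortized straight-line over 5 years with a half-year (midpoint)
# convention, so only 10% of the cost is deductible in year one.
new_year1_deduction = salary / 5 / 2

print(old_year1_deduction, new_year1_deduction)  # 100000 10000.0
```

So on paper a profitable company's taxable income rises by $90k per developer in year one, even though its cash outlay is unchanged.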

    • There is no proof that higher interest rates lead to greater unemployment. In fact, macro employment kind of boomed during the referenced period. I'd posit that higher rates actually boosted macro employment stats. Why? Because higher rates = higher income to rich people via the interest income channel = higher federal budget deficits (the government is a net payer of interest) = higher GDP = lower unemployment, ceteris paribus.


Elon Musk's experiment is the worst anchor you can use for comparison, since the dude destabilized Twitter (re-branding, random layoffs, etc.). I'd be more interested in companies that went leaner but did it in a sane manner. The Internet user base grew between 2022 and now, but Twitter might have lost users in that period, and it certainly hasn't shipped any new innovations beyond trying to charge its users more and confusing them.