Comment by vishnugupta

3 days ago

> removing people and organizational slack

You are spot on w.r.t every assertion you've made. When bean-counters took over the ecosystem they optimised immediate profitability over everything else. Which in turn means, in their mind, every part of the system needs to be firing at 100% all the time. There's no room for experimentation, repair, or anything else.

I've commented about lack of slack several times here on HN because when I notice a broken system nowadays, 90% of the time it's due to lack of slack in the system to absorb short-term shocks.

The problem is, in the minds of these people 'firing at 100% all the time' generally means doing busywork and/or thinking of ways to cheat/manipulate their customers and the market for maximum gain while delivering minimum value. I would have loved to be 100% engaged working on solving real problems in honest ways at some of my past jobs, but alas MBA/marketing leadership, which has taken over much of tech, has very little interest in actually building good things and solving real problems in honest ways.

  • This is what happens when companies become so nepotistic that they only believe in their own bullshit.

    "Can they really breathe fire or did we make that up?"

  • Profit maximization is a continuous process that has generated our high standard of living.

    P.S. I welcome all attempts to prove me wrong!

    • No, the process has impeded an even higher standard of living, because it misallocates resources from value generation to value appropriation. It's the extreme short-term profit maximization that makes the economy a zero-sum game. Otherwise it is not.

      50 replies →

    • I would argue that profit maximization has had very many effects.

      On the one side, it has succeeded at reducing costs, which has indeed given rich societies unprecedented access to consumer goods.

      On the other, it has outsourced both our jobs and our knowledge, which has resulted in higher unemployment and dissatisfaction, with the political dominoes we see falling internationally as a consequence. That and the shoddy US health system (which the rest of the world seems to have decided to follow, for some reason).

      And there is the small fact that we're in the process of optimizing the planet to death, and that not-so-rich countries (as well as formerly-rich ones) have starved to death for this high standard of living.

      So, let's appreciate our standard of living, but not assume that it's necessarily a good thing in the grand scheme of things.

      4 replies →

    • "Profit maximization" on its own would have left most people working 12+ hours a day 6 days a week, like it was very common in the 19th century. Luckily, it's never been the only force shaping our societies.

      19 replies →

    • I think it's more accurate to say it is a process that has resulted in our high standard of living faster than other processes... so far.

      There is no guarantee it will keep working for the majority of us going forward; as is becoming very clear all around the world, it also has downsides especially without checks and balances (which was predicted and observed in the past, which is why other processes were conceptualized in the first place!)

      As a trivial example, profit maximization is directly responsible for the enshittification we're seeing everywhere, which definitely is negatively impacting our standard of living.

      16 replies →

  • > generally means doing busywork and/or thinking of ways to cheat/manipulate their customers and the market for maximum gain while delivering minimum value

    When I read comments like this I can’t help but wonder where people like you work. It’s completely unrelatable to me. I work with really good people, all the way to the top, and we try to make money by increasing value for our customers.

    Apple, Google, Walmart, Amazon, Home Depot, Anthropic, Toyota, and a hundred other companies all offer me incredible value for so cheap. Why are people so cynical about a world that offers them unimaginable riches everywhere they look?

    Sure there are bad companies. And if you work at one of those, go get a new job.

    • The parent comments are talking about things in the large (negative societal trends), while you are talking about your anecdotal experience and perhaps survivorship bias, having struck it lucky with your employer. The world offers unimaginable riches, but at what cost really? Who benefits most? Where does it lead? Big picture.

    • >Apple, Google, Walmart, Amazon, Home Depot, Anthropic, Toyota, and a hundred other companies all offer me incredible value for so cheap.

      Try to find something with Google these days. Try to use an Apple product past its planned obsolescence. They crowd out innovation with their monopolistic rent seeking.

I think the bean counters get a bad rap for this a bit unfairly. The past century has seen more progress in knowledge and technology than the rest of human history combined. The world and business environment are changing too rapidly to make longtermist thinking practical.

Few care if you have a lifetime warranty and excellent service or replacement parts if the majority will upgrade in a few years! Mature technologies increasingly become cheaply available as services, e.g. laundry, food, transportation. That further reduces demand on production, as many can get by with the bare minimum and don't need the highest-quality, longest-lasting appliances. Software is even more ephemeral and specialized.

Developing education and training pipelines is wasting money if the skills you need are constantly changing! There is plenty of "slack" in the workforce so this works just fine in most cases - somebody will learn what they need to get paid. There are very few fields where qualified worker shortages are a real problem.

R&D can be outsourced or bought and subsidized by the government in universities, so why do everything yourself? Open source software has even further muddied the waters. Applications have only a limited lifetime before being replicated and becoming free products (this has only been intensified by the introduction of AI), so companies develop services instead.

Technology and knowledge deepening and rapidly becoming more specialized makes the monolithic corporation much less practical, so companies also need to specialize in order to effectively compete. Going too far in the name of efficiency can destroy core competencies, but moving away from the old model was necessary and rational.

  • > R&D can be outsourced or bought and subsidized by the government in universities, so why do everything yourself?

    Because some of the problems that companies in very specialized industries work on are so specialized that, outside of that industry, hardly anyone has even heard of them.

    Additionally, many problems companies have where research would make sense are not the kind of problems that are a good fit for universities.

    • Those fields still develop in-house expertise and world-leading products. General Electric was cited above, but their turbine engine division is producing the most fuel-efficient, reliable, and lowest-TCO aircraft engines there have ever been. The materials science and engineering expertise needed to do this isn't something you can find in a freshly-graduated university student.

      Products like jet engines, though, are still those where quality matters. They are so costly that there's room in the finances to deliver it. Unlike household appliances, where consumers make decisions mostly on the basis of price and being $5 cheaper than the competition is what will get you the sale even if it means using plastic instead of cast or forged metal parts.

      3 replies →

  • Universities don't do product-oriented research. They do more general research. And also, they should not do product-oriented research; that is the companies' role.

    And universities' research capabilities are being destroyed right now, too.

  • > Developing education and training pipelines is wasting money if the skills you need are constantly changing! There is plenty of "slack" in the workforce so this works just fine in most cases - somebody will learn what they need to get paid. There are very few fields where qualified worker shortages are a real problem.

    Here's the problem with your reasoning. This paragraph is simply wrong, with each sentence being untrue. Education and training are never wasted money, the skills aren't changing that quickly, there isn't any slack in the workforce, and qualified worker shortages are being reported in every trade across the board. Someone needs to solve the problems you hand-wave away.

    > this works just fine in most cases - somebody will learn what they need to get paid.

    That's me. I specialize in learning new domains. I cost like 8x more than the random junior you'd be able to hire with a functional onboarding program.

    • > there isn't any slack in the workforce, and qualified worker shortages are being reported in every trade across the board

      Labor force participation is ~62%, far lower than historical peaks. I don't buy it.

      1 reply →

  • "The world and business environment are changing too rapidly to make longtermist thinking practical." Tell that to the Chinese...

I’ll note that at the end of the last century I worked at IBM Research, which had a budget of 6 billion dollars. Management was trying very hard to get a better return on that investment. Even today IBM, though often ridiculed in the tech space (sometimes they do deserve it), spends a lot on R&D.

  • Lucent at the same time went through the same issue: how to monetise Bell Labs.

    Bell Labs greatest work came out when AT&T was a monopoly. Once they were broken up (1984?) they started feeling the pain.

    When the Lucent spinoff took place, the new entities had no Monopoly money to fund unconstrained research while management's behaviour never changed.

    I don't know how BL fared under Alcatel and now Nokia, but haven't heard of anything interesting for years.

    • I've been to the Holmdel office in the decline years. It was very sad. A fraction of the former staff was rattling around in what could've been used as a post-apocalyptic sci-fi set. In its heyday it must've been magnificent. Imagine taking an entire great research university and putting it into a single architectural masterpiece. I've also been to Nokia HQ after Elop ruined the place. Also sad.

      2 replies →

  • Did anything come out from those billions?

    • > Did anything come out from those billions?

      Per wikipedia:

        IBM employees have garnered six Nobel Prizes, seven Turing Awards,
        20 inductees into the U.S. National Inventors Hall of Fame, 19 National Medals of Technology,
        five National Medals of Science and three Kavli Prizes. As of 2018,
        the company had generated more patents than any other business in each of 25 consecutive years.

      13 replies →

    • Toshiba, IBM and Siemens had a DRAM joint development program from 1993 to 1998. Several generations of DRAM were developed there. Also, while IBM exited the DRAM business, the knowledge survived in Rambus to an extent.

> Which in turn means, in their mind, every part of the system needs to be firing at 100% all the time

Not just that, you have to always be doing less for more gains. Real work is bad work. Shrinkflation good. I don't know what that is if not a pure scammer mindset.

  • > Which in turn means, in their mind, every part of the system needs to be firing at 100% all the time

    This is a classic Goldratt / Theory of Constraints mistake.
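
    A minimal sketch of the queueing-theory intuition behind that mistake, assuming a simple M/M/1 model: as utilization approaches 100%, waiting time blows up, which is exactly why systems need slack.

      # M/M/1 queue: expected time in system W = 1 / (mu - lambda).
      # As utilization rho = lambda / mu approaches 1, W grows without bound.
      def time_in_system(arrival_rate: float, service_rate: float) -> float:
          if arrival_rate >= service_rate:
              return float("inf")  # unstable: the backlog grows forever
          return 1.0 / (service_rate - arrival_rate)

      for rho in (0.5, 0.8, 0.9, 0.95, 0.99):
          # service rate fixed at 1 job per unit time, so utilization equals the arrival rate
          print(rho, round(time_in_system(rho, 1.0), 1))
      # 0.5 -> 2.0, 0.8 -> 5.0, 0.9 -> 10.0, 0.95 -> 20.0, 0.99 -> 100.0

    The exact numbers don't matter; the point is that the last few percent of utilization are bought with disproportionately long queues, which is the slack argument in miniature.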

> When bean-counters took over the ecosystem [...] in their mind, every part of the system needs to be firing at 100% all the time.

This is only fair, because they themselves are firing at 100% all the time IYKWIM ;)

I believe private equity ownership represents this in an aggressive form. The '2 and 20' take (roughly a 2% management fee plus 20% of the profits) that PE usually mandates as part of their purchase agreements means that they are highly incentivized to maximize short-term "wins" over long-term survival.

I think Chesterton and Taleb also had pretty reasonable things to say about understanding a system before you make changes, and about fragile vs. anti-fragile systems.

It’s especially ironic since the bean counters produce no value. I like ‘Developer Hegemony’, even if the title needs changing. The author makes a great case for why information workers produce almost all the value. It’s them that make the profits, yet they’re always a cost center.

It's funny because when I first managed at a public company, I was told no employee could spend more than 80% of their time on any one thing (and sustained 80% actual work wasn't believable), and that if my people were logging 80% or more of their time to capitalizable projects I would be in trouble.

They also took out all the quality, though in pure business terms one can argue that's a kind of "slack" by itself.

The beancounters have cut all the corners on physical products that they could find. Now even design and manufacturing is outsourced to the lowest bidder, a bunch of monkeys paid peanuts to do a job they're woefully unqualified for.

And the end result is just a market for lemons. Nobody trusts products to be good anymore, so they just buy the cheapest garbage.

Which, inevitably, is the stuff sold directly by Chinese manufacturers. And so the beancounters are hoisted by their own petard.

We've seen it happen to small electronics and general goods.

We're seeing it happen right now to cars. Manufacturers clinging on to combustion engines and cutting corners. Why spend twice the money on a western brand when their quality is rapidly declining to match BYD models at half the price?

---

And we're seeing it happen to software. It was already kind of happening before AI; so much of software was enshittifying rapidly. But AI is just taking a sledgehammer to quality. (Setting aside whether this is an AI problem or a "beancounters push everyone into vibecoding" problem.)

E.g. Desktop Linux has always been kind of a joke. It hasn't gotten better, the problems are all still there. Windows is just going down in flames. People are jumping ship now.

SaaS is quickly going that way as well. If it's all garbage, why pay for it? Either stop using it or just slop something together yourself.

---

And in the background of this something ominous: Companies can't just pivot back to higher quality after they've destroyed all their inhouse knowledge. So much manufacturing knowledge is just gone that starting a new manufacturing firm in the west is a staffing nightmare. Same story with cars, China has the EV knowledge. And software's going the same way. These beancounters are all chomping at the bit to fire all their devs and replace them with teenagers in the developing world spitting out prompts. They can't move back upmarket after that's done.

Even when the knowledge still lives, when the people with the required skills have simply moved on to other industries and jobs, who's going to come back? Why leave your established job for the former field, when all it takes is the management or executive in charge being replaced by another dipshit beancounter for everyone to be laid off again?

  • > E.g. Desktop Linux has always been kind of a joke. It hasn't gotten better, the problems are all still there.

    Desktop Linux has gotten better, though much of the improvement happened decades ago. I believe the first person to prematurely declare "the year of Linux on the desktop" was Dirk Hohndel in 1999: https://www.linux.com/news/23-years-terrible-linux-predictio...

    And speaking as someone who was running desktop Linux in 1999, I remember just how bad it was. Xfce, XFree86 config files, and endless messing around with everything. The most impressive Linux video game of 2000 was Tux Racer.

    But over the next 10 years, Gnome and KDE matured, X learned how to auto-detect most hardware, and more-and-more installs started working out of the box.

    By the mid-2010s, I could go to Dell's Ubuntu Linux page and buy a Linux laptop that Just Worked, and that came with next day on-site support. I went through a couple of those machines, and they were nearly hassle free over their entire operational life. (I think one needed an afternoon of work after an Ubuntu LTS upgrade.)

    The big recent improvement has been largely thanks to Valve, and especially the Steam Deck. Valve has been pushing Proton, and they're encouraging Steam Deck support. So the big change in recent years is that more and more new game releases Just Work on Linux.

    Is it perfect? No. Desktop Linux is still kind of shit. For example, Chrome sometimes loses the ability to use hardware acceleration for WebGPU-style features. But I also have a Mac sitting on my desk, and that Mac also has plenty of weird interactions with Chrome, ones where audio or video just stops working. The Mac is slightly less shit, but not magically so.

    • > Desktop Linux has gotten better

      This is on me for being a bit too snarky.

      So yes, Desktop Linux has "gotten better". What it hasn't done is solve any of the systemic problems.

      The Open Source development quirks that created the shitshow of 1999 are still here. Gnome is better but still suffers massively from mainstream features being declared stupid by the maintainers. (A power button that turns off the machine? Heretical.)

      Valve's recent successes are pretty illustrative here. They used their money to directly hijack the projects their products rely on.

      As far as the comparison goes, Windows is not without this "slow" improvement either. 95 and 98 are lightyears behind contemporary Windows in so many ways. Until quite recently it still made about as much sense to use Linux as it did back then: not much.

      Take your Linux laptop example. Sure, Linux finally kind of worked on some specific models that were tested for it. Meanwhile, Windows had moved from "it'll work with some mucking about with drivers" to "it works universally, on practically all hardware". Really, by the mid-2010s Windows would finally be quite tolerant of you changing the hardware.

      Hence my original point; Desktop Linux hasn't really caught up with Windows in any meaningful sense. Windows is just nose-diving into the ground in the last few years.

      3 replies →

    • For some reference, back in the Ubuntu 6 days around 2005, I switched. It took me 2 weeks to get X.Org to run with my nvidia card at the time. 2 weeks of messing with config files. I only persisted because I was so sick of Windows.

  • > And in the background of this something ominous: Companies can't just pivot back to higher quality after they've destroyed all their inhouse knowledge. (...) They can't move back upmarket after that's done.

    The knowledge isn't the problem. It can be quickly regained, and the progress of science and technology often offers new paths to even better quality, which limits the need for recovering the details of old processes.

    The actual problem is, there is no market to go up to anymore. Once everyone is used to garbage being the only thing on offer, and has adjusted to cope with it, you cannot compete on quality anymore. Customers won't be able to tell whether you're honest, or just trying to charge suckers for the same garbage with a nicer finish, like every other brand that promises quality. It would take years of effort and low sales to convince the customers to start believing you're the real deal, which (as beancounters will happily tell you) you cannot afford. And even if you could, how are you going to convince people you're not going to start cutting corners again a few years down the line? In fact, how do you convince yourself? If it happened once, if it keeps happening everywhere across the whole economy, it's bound to happen to your business too.

    • Wrong on the first point, right on the second. Institutional knowledge can't be easily regained. To build up the knowledge to, say, make a transistor, you need a bunch of people experimenting with a bunch of things. Published scientific papers and patents will get you part of the way there, but the final stretch is still up to you, including things like which equipment to buy, purity of supplies (and where to get them!), how long the chip needs to be bombarded by each kind of particle, how much air the cleanroom needs to move. All the tiny details. You have to discover them by trial and error. Actual chip manufacturing companies have found themselves unable to get good yield until they copied the floor plan of another working fabrication plant, and they still have no idea why that mattered, but that's an extreme case. Maybe nobody expected that minuscule air contamination from one process step was affecting another nearby process step, and in the original plan they were farther apart.

      Yes if you want to wire a neighborhood for internet you can skip DSL and go straight to fiber. That's not the problem. The problem is that nobody in your company knows how deep to put the fiber to minimize problems, how much redundancy is needed, how strong the mechanical armor around the fiber needs to be, how many fibers per cable to meet future capacity needs without excessive costs, which landlords are friendly to you, nobody has the right connections to city hall to get digging permits approved expediently, and so on.

    • > It can be quickly regained

      I'm not sure what you mean with this?

      Sure, hypothetically e.g. any western car manufacturer could poach a bunch of BYD employees. But it's not really practical for most businesses.

      > The actual problem is, there is no market to go up to anymore.

      This is the "Market for Lemons" problem, yes.

      It's less of a problem than you might think. Convincing the entire wider world that you're legitimate is a problem. One made infinitely worse by store marketplaces like Amazon preferring to push "aqekj;bgrsabhghwjbgawrjwsraG" brand garbage.

      So you just don't. The trick is to start small. The smallest you can sustain. (This doesn't work for cars, or anything that's sufficiently complex. You won't be taking on Salesforce.)

      But so long as you can find a market niche where there's demand for quality, you can carve out a living, and from there, scale up.

      The problem with that is twofold: Venture Capital has supplanted other forms of investment and "small business generating single digit millions in revenue" is utterly unappealing to VCs, even though the investment required is downsized accordingly.

      And problem #2: The cost of starting a business is too high right now. Real estate and cost of living just make it unaffordable to even try. + Healthcare if you're in the US.

      1 reply →

  • > Desktop Linux has always been kind of a joke. It hasn't gotten better, the problems are all still there.

    Desktop Linux mostly works these days. It does everything most regular people would want of it, with zero fuss. Including playing games. In some respects, it's easier to use than Mac or Windows.

    When it has trouble with some things, one must remember neither Mac nor Windows is perfect, and they can be extremely frustrating at times.

    Time to update those prejudices!

    • You know, I think you're right. I set up Void Linux on my desktop a couple of years ago, using the plainest base image. I chose that path because ZFS is important to me, and systemd is ugly to me.

      I determined that this minimal start was the right confluence of those two preferences with the howto documentation available at the time, so that's what I did.

      So I'm going along in rolling releases for years, and even playing games fine (except GTA:V online). Things are fine until there's some weird flickering-window bug that rolls down the pipe and apparently only affects nVidia users who use xfce4. It's an awful and particularly jarring bug.

      The workaround is to disable vsync in xfce4, which solves that problem but always causes tearing for me. Even in YouTube videos. (Who knew how reliant we'd become on rote frame synchronization? Why, it seems like just yesterday when I was streaming potato-quality RealVideo episodes of South Park over dialup on a K6-2 box while marveling that any of it worked at all.)

      I don't know which party is responsible for that problem, but nVidia has supposedly fixed it on their end in recent weeks. Which is cool, but this distro isn't shipping that version yet. And I'm reluctant to go off-script -- I loathe the idea of letting the nVidia installer do whatever it wants, and I'm not too keen on building my own package for xbps to handle (hopefully with better grace), either.

      It was time to fix it anyway.

      And I wanted to give Wayland a shot because there's a limitation with SDL that causes the clipboard method used by Factorio to fail to work with large blueprints under X11, so change was already in the air.

      And so it began.

      KDE Plasma: I couldn't get it to work. With both X11 and Wayland, parts of it would just die without leaving any traces I could find. It was unusable. I spent hours troubleshooting it and only a few minutes of actually using it, which is a terrible ratio. It had to go.

      Gnome: It actually worked OK with Wayland. But when I say "worked OK", I mean that it acted like a touchscreen interface -- for toddlers. It didn't crash in mysterious and buried ways, which is good, but I found it to be an affront to my sensibilities in ways that I simply could not tolerate. I won't apologize for hating it or for feeling insulted by it. That iteration of Gnome is dead to me.

      So working down the list of non-ancient desktop environments: Cinnamon? It works. It's alright. It took some kicking to get sound to work, because the mixer it comes with makes it impossible to set up default outputs in a way that behaves here, and it reverts the system to its own broken ideas every time it runs. But it responded to my kicks without much of a fight, and it works. The xfce4 volume mixer is on the taskbar instead, and I removed all traces of the Cinnamon mixer like it was a cancerous tumor, but with that done: Sound works. Regular X stuff works. It's good enough.

      I haven't tried it with Wayland yet to see if I can work around the SDL+Factorio SNAFU, but it's been behaving itself with X11 for a week or more.

      ---

      Now, that may sound awful. And to be clear, it wasn't fun at all. But at least it's not like my Windows laptop. I've got stories about that, for sure.

      One of the things that sticks out right now is when that laptop would deplete its battery just sitting in my bag in the car. That was weird, but it became more urgent to fix it when I got the machine out of the bag and it was hot.

      The cause for that was an HP printer driver (for a rather old color printer that I don't even own -- I do use it sometimes, but it's 25 miles away) that was periodically waking the machine from hibernation to check that printer's supply status, so it could try to sell me more stuff.

      This task was so completely buried in Windows that it took hours to find it, and it was configured by its installer to wake the machine from hibernation -- including, specifically, while on battery. Because that's obviously what every user of any printer needs: Computers that turn themselves on using battery power to sell toner cartridges while hidden unseen inside of a bag in the back of a car.

      It didn't have to be that way, but it was this way anyway. I consider that kind of thing to be deliberate in a fashion that extends beyond mere maliciousness: It is instead simply fucking evil.

      ---

      So yeah, Linux is a great desktop.

      I'd like to propose a new slogan: "Linux. At least it's not deliberately evil. Usually."

      1 reply →

  • I think you’re not blaming political leadership enough. NAFTA and other programs were always going to lead to the state of affairs we have now. This was a choice. Blaming greed is like blaming gravity.

  • > E.g. Desktop Linux has always been kind of a joke

    And yet I run it every day, and it's by FAR the most enjoyable platform and tooling to use (for me).

Engineers seem to think business people don’t know what they are doing, but if your post were true, then companies would add slack to outperform their competitors.

The broken system likely doesn’t have enough business impact to justify the investment to maintain it.

  • Adding slack works over years.

    Cutting slack gets you quarterly bonuses.

    When you plan on working 3-5 years at a single company, you don’t care if it crashes and burns a month after you leave; you just move on to burn down the next one.

    Conversely, we see the same dynamic with engineers: they build stuff to prop up their CV and don't care whether the company can still support the crap they built after they leave.

  • It's a measurement problem, which engineers also fall prey to, perhaps even more.

    It's the danger of data driven decision making. Cutting people and resources right now gets you a measurable gain. Not cutting them gets you a gain tomorrow.

    But, that gain is unmeasurable! Because in order to measure it you would need to know what happens in an alternate universe where you cut those people. So, if you're only making data driven decisions, you would cut the people 100% of the time.

    But that's why companies aren't run by algorithms, they're run by people. The algorithm would run the company into the ground.

  • > companies would add slack to outperform their competitors.

    I think if they did this they'd get buried by the market. Your slack is someone else's opportunity to undercut you. It's a systemic problem, it's in every individual's self interest to work towards instability.

  • This would be true if everyone was optimizing for the same thing.

    It's not terribly difficult to imagine someone optimizing for, say, a bonus at the end of the year.

> optimised immediate profitability over everything else

Which is the usual complaint that businesses are focused on short term results, sacrificing long term results.

If that were generally true, the stock market would be going down steeply, not up, as stock prices are based on expectations of future profits.

  • Are stock market profit expectations mostly long term? Stock markets have been wrong before.

    Besides that, the U.S. stock market went up over several decades while manufacturing capabilities were transferred overseas. That has had, and will continue to have, domestic ramifications that might not be captured by investor profits.

  • > as stock prices are based on expectations of future profits.

    I thought stock prices were based on what I thought I could sell it for next week.

> Without people who have actually worked with the system, you end up with a loss of tacit knowledge—and eventually, declining productivity.

> You are spot on w.r.t every assertion you've made.

Huh? What happened to the concept of "debate" on HN? It's just a bunch of people agreeing with each other. Yet the data doesn't support OP's thesis at all.

Here's a chart of the rise in productivity per hour worked in the United States since 1947. It's a steady linear increase every single year: https://fred.stlouisfed.org/series/OPHNFB

Yours is the type of story big company workers tell themselves to feel important while refusing to learn anything new and never taking any risks. But the truth is 99.999% of companies are not doing anything that unique or complex. Most companies are not ASML.

If I had a nickel for every time I've heard someone justify their do-nothing position within a giant bureaucracy while saying the phrase "institutional knowledge" I'd be rich. This is just a sign of a poorly run giant company full of engineers building esoteric and overly complex in-house solutions to already-solved problems as job security.

The truth is all of this "institutional knowledge" is worthless in the face of disruption, and it has a half life that's getting shorter every day.

Everybody talks shit about global just-in-time supply chains and specialization...but just because we had a fake toilet paper shortage for a few months during a 100-year global pandemic doesn't mean running things like it's 1947 for the last 70 years would have been better. You enjoy a much higher quality of life today due to these "evil" JIT supply chains which it turns out are far more durable than people want to claim.

  • Most measurements measured in dollars are just stealth measurements of inflation. Even inflation adjusted measurements, because official inflation metrics are always lowball numbers with shady methodology.

  • US aggregate productivity metrics fail to address this nuance. There is a fundamental difference in abstraction layers between a macro-system becoming more efficient and an individual enterprise experiencing operational failure. As a software engineer, I find distinguishing between these layers critical. Your argument is akin to claiming that because the Google Play Store sees a higher volume of app releases (increased productivity), the intrinsic quality of individual apps has naturally improved.

    In this analogy, the individual app represents a company, and the Play Store represents the broader US market. Silicon Valley’s highly liquid labor market allows talent to flow freely, which opens up and elevates the baseline of the overall market. However, that is entirely distinct from the fact that individual companies are suffering severe drops in internal quality and productivity.

    Furthermore, in software architecture, 'productivity' and 'quality' are rarely directly proportional. With AI coding tools, we can ship an app several times faster. Historically, it took me three months to write 60,000 lines of code; recently, I am generating that same volume in just two weeks. My productivity has undeniably spiked, but can I confidently claim the code quality is better than when I manually scrutinized every single line?
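
    For scale, a quick back-of-the-envelope check of those figures, assuming roughly 13 weeks in three months (that week count is my assumption):

      # Rough throughput comparison from the 60,000-line figures above
      lines = 60_000
      before = lines / 13   # lines per week over roughly three months
      after = lines / 2     # lines per week over two weeks
      print(round(before), round(after), round(after / before, 1))
      # ~4615 vs 30000 lines per week, i.e. roughly a 6.5x speed-up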

    The real issue is not whether the broader economy has grown more productive since 1947. The core issue is whether a specific organization bleeds capability when the exact people who understand its real-world constraints, failure modes, and operational history walk out the door.

    Both realities can co-exist: National productivity can trend upwards, while individual companies simultaneously suffer operational regressions due to botched migrations, failed refactors, or the loss of tacit knowledge.

    I agree that 'institutional knowledge' is sometimes weaponized to defend unnecessary complexity. However, the opposite fallacy is treating all localized, domain-specific knowledge as worthless. While some of it is merely job-security folklore, the rest is literally the only surviving documentation of why the system functions in the first place.