
Comment by JohnMakin

3 days ago

Plenty of people see it - but, to a hiring team, a junior is an extremely risky investment. They demand a high cost relative to when they can start contributing actual value, may not work out, or may jump ship the moment they become competent. It is rational for a business to want to eliminate this risk. It's possible that everyone is acting rationally here, knowing it will lead to a result that is not favorable down the line - because the immediate benefit is too great to pass up.

In other words the gamble of hiring expensive juniors with shiny degrees is greater to them than the gamble of not having competent seniors a few years down the line. And that risk may be overblown - people are still hiring some juniors, it's not like it has stopped entirely - so future seniors will likely just be worth more than they are currently. To some, that may be worth the risk, especially if you believe AI will continue to get stronger.

I am not saying I agree with this decision making, more pointing out the thought process. We have had to have similar discussions where I am but are still hiring juniors, FYI. That's basically all we're hiring right now, actually, because the market for strong juniors is very good right now.

It's not an economic decision, it's a cultural one. Are you investing to build something useful and sustainable? Or are you exploiting for a profitable quarter?

I saw someone compare the mindset to that of a drug dealer. In any given neighborhood, a handful of people get very wealthy at the expense of the stability and potential of everyone else. Our elite are drug dealers - literally, in some cases. And conditions are deteriorating about as you'd expect.

  • Besides the “it's not x, it's y” LLM smell here, no, comments like this are also part of the hype, just the other side of it. The fact that LLM tooling can replace a lot of the tedium typically set aside for juniors is hardly disputable at this point.

    • Okay. And you could also still hire the juniors and have them oversee the LLMs, interrogate them about how well they understand the principles and technical details of what they're having the LLM do, correct them when they're wrong, try to get them to explore other approaches or extend the rote approach or synergize with some other task, etc. You know, training. Like companies used to do (or so I hear, such initiatives having been long gone by the time I hit the workforce).

      The fact that you won't isn't a productivity, bottom-line decision, as we've already established that the business is trading efficiency now for incompetence later; the financials are a wash, at best. It's a cultural decision to throw your youth under the bus for seniors and shareholders' short-term interests. The best you could say is, "Well, of course. This has been a common narrative across the American economic landscape for the past 30ish years. 'F* them kids,' is the rule."

      >Besides the “its not x, its y” llm smell here

      Kindly fuzakenna off plx.

And what happens in half a generation or so when those seniors start retiring? The only way software production will meet demand is if the fewer seniors out there are propped up by way more competent AI than we have now. That also means the work will fundamentally change from being massively nerdy to moderately nerdy with the ability to work with AI. Many of the people in the computer industry now just won't be attracted to that type of work. What will they do? Become physicists or mathematicians? And what type of person is tomorrow's senior software developer?

edit: maybe today's computer nerds will become tomorrow's backyard hackers, the only ones able to beat the AI.

> In other words the gamble of hiring expensive juniors with shiny degrees is greater to them than the gamble of not having competent seniors a few years down the line.

I mean, writing the code which makes mon^H^H^H^H provides value for minimum cost is the ultimate goal of a software company, but any competent CS grad or anyone with basic algorithms knowledge knows that greedy algorithms can't solve all problems. Sometimes the company needs to look ahead, try, fail and backtrack.
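For what it's worth, the classic textbook illustration of that point is coin change with awkward denominations - a hypothetical sketch (denominations chosen specifically so the greedy choice backfires):

```python
# Greedy vs. look-ahead on coin change. With denominations {4, 3, 1},
# "always grab the biggest coin" is locally optimal but globally wrong.
from functools import lru_cache

COINS = (4, 3, 1)  # chosen so the greedy strategy is suboptimal


def greedy_coins(amount: int) -> int:
    """Always take the largest coin that fits - no backtracking."""
    count = 0
    for coin in COINS:
        count += amount // coin
        amount %= coin
    return count


@lru_cache(maxsize=None)
def optimal_coins(amount: int) -> int:
    """Try every coin, recurse, keep the best - i.e., look ahead and backtrack."""
    if amount == 0:
        return 0
    return 1 + min(optimal_coins(amount - c) for c in COINS if c <= amount)


print(greedy_coins(6))   # 3 coins: 4 + 1 + 1
print(optimal_coins(6))  # 2 coins: 3 + 3
```

The greedy pass grabs the 4 first and is stuck making change with 1s; the searching version pays a little now (exploring the 3) to win later - which is the commenter's point about hiring.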

Nerdy analogies aside, self-sabotaging a whole sector out of greedy shortsightedness is a pretty monumental misstep. It's painful yet unbelievably hilarious at the same time. Pure dark comedy.

  • The problem is that it's systemic. The entire system rewards the short term thinking, so that even people with some awareness of what's happening tend to contribute to it all. People are fantastically good at finding reasons to work at places like OpenAI, Anthropic, Google, Meta, Palantir, X, etc. And once they're there, they similarly figure out how to justify the actions they're taking.