I’m not sure if this is true.
At the company where I work (one of the FAANGs), there is suddenly a large number of junior IC roles opening up. This despite the trend of the last few years to only hire L5 and above.
My read of the situation:
- junior level jobs were sacrificed as cost cutting measures, to allow larger investment in AI
- some analysts read this as “the junior levels are being automated! Evidence: there is some AI stuff, and there are no junior roles!”
- but it was never true, and now the tide is turning.
I’m not sure I ever heard anybody in my company claim that the dearth of junior openings was due to “we are going to automate the juniors”. I think all of that narrative was external analysts trying to read the tea leaves too hard. And, wannabes like Marc Benioff pretending to be tech leaders, but that’s a helpful reminder that Benioff is simply “not serious people”.
In addition, the industry has been going through a massive correction post-Covid, with all the free money drying up. Any impact AI is having is all mixed up with that.
The expectations for juniors, and how seniors work with them, will certainly change, but it's way too early to be making doomsday predictions.
Of course, that's easy for me to say when I'm not the one who just spent thousands of dollars and 4 years of their life to land in an environment where getting a job is going to be challenging, to say the least.
Free money did not dry up. I genuinely couldn’t imagine the umpteenth SaaS or data startup trend continuing for another several decades towards the end of the last one. There was almost nothing left to build until AI …
There were symptoms of it right here on HN. Lots of fiddling around with framework churn, not really building anything compelling. The 2010s were not inspiring in that regard, and I personally felt like it was an unworthy field … until AI reinvigorated it (speaking entirely from a creative standpoint).
Agree, the death of the junior SWE is greatly exaggerated. (At least in FAANG)
Maybe there was some idea that if AI actually solved software engineering in a few years you wouldn't need any more SWEs. Industry is moving away from that idea this year.
The death, maybe, but not the lack of hiring. At $BIGCORP, where I work, I haven't seen an externally hired junior dev in at least 2 years in an extended team of ~100 people.
I agree that AI wasn't genuinely replacing junior roles to any important extent, and the larger investment in AI is spot on.
Fast Company had exactly this take in November in "AI isn’t replacing jobs. AI spending is".
https://www.fastcompany.com/91435192/chatgpt-llm-openai-jobs...
"We’ve seen this act before. When companies are financially stressed, a relatively easy solution is to lay off workers and ask those who are not laid off to work harder and be thankful that they still have jobs. AI is just a convenient excuse for this cost-cutting. "
At my company, we're actively lowering our off-shore dev count in favor of on-shore devs. We're small but we're growing, so we're hiring about one junior dev a year. This alone doesn't mean anything, but it's another data point for the conversation.
That narrative never sat right with me. That all these companies suddenly decided AI was going to replace humans? Just an obvious pit to fall into, and one that conveniently feeds the "AI is taking your job" meme. Your read makes MUCH more sense.
I'm with you on this, though I do think some people are true believers. Say a lie enough times, right?
But a big part of it to me is looking at the job data[0]. If you look at devs during this period, you can see that they hired more during the pandemic into early-to-mid 2022, but postings are currently lower than in any other industry.
Tech loves booms and busts, with hiring and everything else. But more than anything the tech industry loves optics. The market has rewarded the industry for hiring during the pandemic and in the past year it has rewarded them for laying people off "because AI". And as the new year comes around they'll get rewarded for hiring again as they "accelerate development" even more. Our industry is really good at metric hacking and getting those numbers to keep going up. As long as it looks like a good decision then people are excited and the numbers go up.
I think the problem is we've perverted ("over optimized") the market. You have to constantly have stock growth. The goal is to become the best, but you lose the game by winning. I think a good example of this is from an article I read a few months ago[1]. It paints AWS in a bad light, but if you pull out the real data you'll see AWS had a greater increase in absolute users than GCloud (you can also estimate this easily from the article). But with the stock market it is better to be the underdog with growth than the status quo with constant income[2].
What a weird way to optimize our businesses. You are rewarded for becoming the best, but you are punished for being the best. It feels like only a matter of time before they start tanking on purpose because you can't go up anymore, so you need to make room to go up[3]. I mean, we're already trading on speculation. We're beyond tech demos pushing stock up (already speculative); now our "demos" are not even demonstrations but what we envision tech that hasn't been built yet will look like. That's much more speculative than something that is in beta! IDK, does anyone else feel like this is insane? How far can we keep pushing this?
[0] Go to "Sector" then add "Software Development" to the chart https://data.indeed.com/#/postings
[1] https://www.reuters.com/business/world-at-work/amazon-target...
[2] Doesn't take a genius to figure out you'd have made more money investing $100 in GCloud vs $100 in AWS (in this example). The percentage differential is all that matters, and percentage growth punishes having a large existing userbase. You have double the percentage growth going from 1 user to 100 than from 10 million to 500 million, yet anyone would conclude the latter is a better business. (The arithmetic is sketched below.)
[3] Or at least play a game of hot potato. Sounds like a collusion ring in waiting, e.g. AWS stagnates and lets Azure take a bunch of users, then Azure stagnates and users switch back to AWS. That gives both the ability to "grow", and I'm sure all the users will be super happy with constantly switching and all the extra costs of doing so...
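To make the arithmetic in footnote [2] concrete, here is a minimal Python sketch (illustrative figures only, not real AWS or GCloud numbers):

```python
# Toy arithmetic for footnote [2]: percentage growth flatters a small base,
# while absolute growth favors a large one. All numbers are made up.

def pct_growth(start: int, end: int) -> float:
    """Percentage growth going from `start` users to `end` users."""
    return (end - start) / start * 100

small_base = pct_growth(1, 100)                    # 9,900% on a tiny base
large_base = pct_growth(10_000_000, 500_000_000)   # 4,900% on a huge base

print(f"1 -> 100 users:    {small_base:,.0f}% growth, +{100 - 1:,} users")
print(f"10M -> 500M users: {large_base:,.0f}% growth, +{500_000_000 - 10_000_000:,} users")
# The second business added ~490 million users yet shows roughly half the
# percentage growth, which is the distortion the comment is describing.
```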
FAANG has shed between 81,000 and 87,000 workers in the past 5 years; I suspect a significant chunk of these jobs aren't coming back.
Seems to me the companies are mostly in a holding pattern: sure, if an important project needs more bodies, it's probably okay to hire. I suspect that lots of teams have to make do until further notice.
Are some teams using AI instead of hiring junior engineers? I don't think there's any doubt about that. It's also a trial period to better understand what the value-add is.
Based on listening to engineers on various podcasts, almost all of them describe the current level of AI agents as being equivalent to a junior engineer: they're eager and think they know a lot but they also have a lot to learn. But we're getting closer to the point where a well-thought out Skill [1] can do a pretty convincing job of replacing a junior engineer.
But at the rate AI is improving, a company that doesn't adopt AI for software engineering will be at a competitive disadvantage compared to its peers.
[1]: https://www.anthropic.com/engineering/equipping-agents-for-t...
Meta (Facebook)
2022: ~11,000 employees (13% of workforce)
2023: ~10,000 employees plus 5,000 open positions eliminated
2024: Multiple smaller rounds totaling ~100-200 employees
2025: ~3,600 employees (5% of workforce, performance-based cuts)
Total: Approximately 24,700-25,000 employees
Amazon
2022: ~10,000 employees
2023: ~17,000 employees (split between multiple rounds)
2024: Smaller targeted cuts
2025: ~14,000 employees announced
Total: Approximately 41,000+ employees
Google (Alphabet)
2023: ~12,000 employees (6% of workforce)
2024: Multiple smaller rounds, hundreds of employees
2025: Several hundred in Cloud division and other areas
Total: Approximately 15,000-20,000 employees
Apple
Apple has been an outlier among FAANG companies:
2022-2023: Minimal layoffs (hiring freeze instead)
2024: ~700+ employees (primarily from canceled Apple Car project and microLED display teams)
2025: Small cuts in sales and other divisions
Total: Approximately 800-1,000 employees (significantly less than peers)
Netflix
2022: ~450 employees across two rounds (150 + 300)
2023: Smaller targeted cuts in animation and drama divisions
2024-2025: Minimal additional cuts
Total: Approximately 500-600 employees
Overall FAANG Totals
Across all five companies over the past 5 years: approximately 81,000-87,000 workers have been laid off, with the vast majority occurring in 2022-2023 during the post-pandemic correction period.
> Based on listening to engineers on various podcasts, almost all of them describe the current level of AI agents as being equivalent to a junior engineer: they're eager and think they know a lot but they also have a lot to learn. But we're getting closer to the point where a well-thought out Skill [1] can do a pretty convincing job of replacing a junior engineer.
The people that comment as such are either so disconnected from the software development process or so bought in on the hype that they are forgetting what the point of a junior role is in the first place.
If you hire a junior and they're exactly as capable as a junior 3 years later (about how far in we are now), many organizations would consider letting that employee go. The point of hiring a junior is that you get a (relative to the market) cheap investment with a long-term payoff. Within 1-2 years, if they are any good, they will not be very junior any more (depending on domain, of course). There is no such promise or guarantee with AI, and employing an army of junior engineers that can't really "learn" is not a future I want to live in as a mid-career senior-ish person.
Of course, you can say "oh, it'll improve, don't worry", but I live in the present and I simply do not see that. I "employ" a bunch of crappy agents I have to constantly babysit, only to output more work "units" than I could before, at the cost of some quality. If I had spent the money on a junior, I would only have to babysit for the first little while and then they can be more autonomous. Even if the agents can improve beyond this, relying on the moat of "AI" provider companies to make that happen is not exactly comfortable either.
One of those companies sent me a LinkedIn request with a specific role. I searched on their website and there were very few roles, and none like the one they proposed. So they are doing stealth hiring now. You can't let the world know you need people when you are supposed to be using AI.
It doesn't help that a lot of the graduates I've talked to or interviewed seemed to treat a compsci degree as nothing more than a piece of paper they needed to get to be handed a high paying tech job. If you're motivated enough to learn enough job skills to be useful on your own then I guess you can treat your degree that way. But if you got through 4 years through cheating and minmaxing the easiest route possible and wound up with no retained skills to show for it? Congrats, you played yourself and fell for the "college is useless" meme. Coulda just skipped the student loans and bombed interviews without the 4 year degree.
> Coulda just skipped the student loans and bombed interviews without the 4 year degree.
I think college is useless for the ones out there who already know how to code, collaborate, and do the other things the industry is looking for. Many out there are developing high-level projects on GitHub and other places without having any degree.
Also, most of the stuff you learn in college has absolutely no relation to what you will do in the industry.
Personally, I disagree. Software engineering encompasses a lot more than frontend dev work. In previous engineering positions, I’ve used linear regression, evolutionary computation, dynamic algorithms, calculus, image processing, linear algebra, circuit design, etc. almost all of which I originally learned as part of my computer science degree.
Just because you won't use it doesn't mean it's not useful. Lots of programmers use math. Lots of programmers use DSA knowledge on a daily basis - and if you aren't you're probably writing bad code. I see a lot of O(n^2) code or worse making apps slow for no reason. Pretty basic stuff that most people don't understand despite taking a whole class on it.
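To illustrate the kind of accidental O(n^2) pattern mentioned above, here is a minimal, hypothetical Python example (not taken from any particular codebase):

```python
# Accidental O(n*m): membership tests against a list scan the whole list each time.
def tag_known_users_slow(events, known_user_ids):
    return [e for e in events if e["user_id"] in known_user_ids]  # list lookup is O(m)

# O(n + m): build a set once, then lookups are O(1) on average.
def tag_known_users_fast(events, known_user_ids):
    known = set(known_user_ids)
    return [e for e in events if e["user_id"] in known]

if __name__ == "__main__":
    events = [{"user_id": i % 50_000} for i in range(100_000)]
    known_user_ids = list(range(25_000))
    # Same result, very different running time as the inputs grow.
    assert tag_known_users_slow(events[:2_000], known_user_ids) == \
           tag_known_users_fast(events[:2_000], known_user_ids)
```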
Sure I learned lots of stuff I've never used. Like relational algebra. But I also learned lots of stuff I use a lot, and it's unlikely I'd have studied most of that stuff on my own. During my degree I also had time and opportunity to pursue lots of other topics outside the mandated course material, you're not limited to what they force you to learn.
So sure if you have the motivation, discipline and resourcefulness to learn all that stuff on your own go right ahead. Most people aren't even close. Most people are much better off with a degree.
It has happened several times - junior web devs can't find jobs, junior Java devs can't find jobs, etc. - usually after a surge in the related tech area. We had a large overall surge in tech around Covid time, and as usual there is some adjustment now.
The dotcom bubble had comp sci lecture halls with students overflowing into the hallway. I don’t blame people, it’s migratory. Jobs and resources are there, so, go there.
Then we blame the other group of students for not going there and picking majors where the jobs aren’t.
We need some kind of apprenticeship program honestly, or AI will solve the thing entirely and let people follow their honest desires and live reasonably in the world.
I can confirm from consulting experience that India is where the jobs went. My office provides professional services to North American and European industrial customers in manufacturing and distribution. Roughly 85% of these customers have fully Indian IT teams. Running a SOQL query in our Salesforce instance for 'Devi', 'Singh', and 'Kumar' yields over two thousand hits across client contacts, even.
Since the workers are hired for cost over quality, they're typically incompetent. Though many have learned to parasitize SME and support staff expertise by asking highly specific questions in an extended sequence. It's a salami-slicing strategy where the majority of the work ends up being performed by those SMEs and support staff while the incompetent workers collect the paychecks and credit. I'm pushing my teams to more aggressively identify and call out this behavior, but it's so systemic that it's an endless battle with every new project coming in the door.
Personal frustrations aside, it's very dangerous from both economic and national security perspectives for India to be building and administering so much of the West's IT infrastructure. Our entire economy depends on it, yet we're voluntarily concentrating that dependency in a single foreign nation. A Pacific conflict alone could sever us from the majority of our IT workforce, regardless of India's intentions.
Currently looking for a new role in biotech, and at many companies it seems like almost 40:1 India vs United States roles being posted. This is in R&D, not even manufacturing.
If the US actually cared about retaining jobs for its people, it would enforce onshore/offshore ratios, with heavy taxes on companies that did not reach that ratio.
Companies don't want to pay US salaries; the cost of living in the US is not going down, and engineering talent in India is cheaper - you can hire 2 devs for the cost of 1 US dev. Why would you ever have any US engineering devs?
It won't change organically unless the cost of Indian engineers goes up or the cost of US engineers goes down.
Saying that "we're firing to use AI" makes you look like you have ROI on your AI investments and you're keeping up.
In fact there are possibly other macro-economic effects at play:
1. The inability to deduct engineering expenses for tax purposes in the year they were spent: "Under the Tax Cuts and Jobs Act (TCJA) from 2017, the law requires companies to amortize (spread out) all domestic R&D expenses, including software development costs, over five years, starting in tax years after December 31, 2021, instead of deducting them immediately. This means if you spend $100,000 on software development in 2023, you can only deduct 1/5th (or $20,000) each year over five years" (a toy sketch of this schedule follows after this comment)
2. End of zero-interest rates.
3. Pandemic era hiring bloat - let's be honest we hired too many non-technical people, companies are still letting attrition take place (~10%/yr where I am) instead of firing.
4. Strong dollar. My company is moving seats to Canada, Ireland, and India instead of hiring in the US. Getting 1.5-2 engineers in Ireland instead of 1 senior on the US west coast.
Otherwise AI is an accelerator to make more money, increase profits and efficiency. Yes it has a high cost, but so does/did Cloud, every SaaS product we've bought/integrated.
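For item 1, a minimal sketch of the five-year schedule described in the quoted text. It uses the simplified straight-line treatment as stated there; the actual Section 174 rules have more wrinkles (such as a mid-year convention), so treat this purely as an illustration:

```python
def straight_line_amortization(spend: float, years: int = 5) -> list[float]:
    """Equal deductions over `years` tax years, per the simplified description above."""
    return [spend / years] * years

# $100,000 of 2023 software development spend -> $20,000 deductible per year.
schedule = straight_line_amortization(100_000)
print(schedule)       # [20000.0, 20000.0, 20000.0, 20000.0, 20000.0]
print(sum(schedule))  # 100000.0 in total, but spread across five tax years instead of one
```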
No it's not. There is no shortage of tech problems to solve and there are no tech jobs that AI can do alone.
AI is sucking up investment and AI hype is making executives stupid. Hundreds of billions of dollars that used to go towards hiring is now going towards data centers. But AI is not doing tech jobs.
These headlines do nothing but increase the hype by pointing towards the wrong cause entirely.
Edit: You cannot square these headlines https://news.ycombinator.com/item?id=46289160
Great engineering effort was spent at FAANG to build software on clear, service-oriented, modular architectures that are easy to develop for. Add to that good organization of process, where engineers spend most of their time doing actual dev work.
Enterprise software is a different beast - large, fragile [quasi]monoliths; good luck getting [current] AI to make meaningful fixes and/or develop features in them. And even if AI manages to speed up actual development multiple times, the impact would still be small, as actual development takes a relatively small share of overall work in enterprise software. Of course it will come here too, just somewhat later than at places like FAANG.
Entry level jobs have been getting wiped out for at least 5 years, including tech jobs, and that includes 2 years when not even ChatGPT 3.5 was available. That was the first version that would reasonably respond to any useful question. And if you're being honest, other entry level jobs are far worse off than tech jobs. Entry-level bakers ... outright don't really exist anymore.
Even agentic computing (i.e. an AI doing anything of its own accord for tech-savvy users, never mind average users) is new from this year. I would argue it's still pretty far from widespread. Neither my wife nor my kids, despite my explaining repeatedly, even know what that is, never mind caring.
I'm repeating the mantra from before, and I get that it's not useful. But no, it's not AI wiping out entry-level jobs. It's governments failing to prop up the economy.
On the plus side, this means it can be fixed. However, I very much doubt the current morons in charge are going to ...
I’d go farther and guess that the tech job market would be even worse today without every company with at least 500 headcount (and many smaller than that), whether a tech company or not, putting money into “AI initiatives”.
I don’t think we’ve seen any amount of a net drop in tech jobs on account of LLMs (yet). I actually think they’re (spending on projects using them, that is) countering a drop that was going to happen anyway due to other factors (tightening credit being a huge one; business investment hesitation due to things like batshit crazy and chaotic handling of tariffs; consumer sentiment; etc.)
Reminds me of that comic where the dog runs a ball up to his owner with the thought bubble "Throw!" When the owner goes to take the ball, the dog steps back, thinking, "No take! Only throw!"
So in the glorious future, we'll only need senior devs to manage AI. No juniors! Only seniors!
Outsourcing, end of ZIRP, end of R&D tax credit.
Macro-economic conditions are pushing companies to do more with fewer people. AI might be helping with this, but it's pure marketing BS to blame it for the state of tech employment.
At $113B, 2019 was the third-highest year on record for VC deal volume.
2019 had the second-highest volume of “mega rounds” ($100M deals or greater)–mega rounds represented 44% of total annual deal volume.
Revenue grew by an average of 12.2% in 2019 and the total revenues of the tech giants was greater than the GDP of four of the G20 nations.
Yes, tech hiring in 2025 is down from 2019. That's a lot like saying "tech hiring is down from 2000" in 2003.
> What happens when there are no more entry-level humans to be promoted to mid-level, and so on?
No business cares about that question, just like the Onceler didn't care how many Truffula trees were left. It's not their problem. Business is business, and business must grow, regardless of crummies in tummies, you know.
It even has a name: the tragedy of the commons. I have been saying it constantly for the last few years with all this AI hype over LLMs going on. But with business focus really narrowing down to short time frames, what do you expect?
The "business" doesn't care about this, but individual employees care about their job duties, not their business. And some of them do have a job duty where they care about this.
(i.e. this cynical complaint is exactly the opposite of the cynical complaint about managers/directors engaging in empire building.)
Well, look at what has always happened in society when young people have no hope for the future: massive societal disruption, mostly in the form of revolution + violence.
Since this isn't the 1800s anymore there won't be any major revolutions, but I expect way more societal violence going forward. If you have no hope for the future it's not hard to go down very dark paths quickly, usually through no fault of your own, sadly.
Now add how easy it is for malicious actors to get an audience and how LLM tech makes this even easier to do. Nice recipe for a powder keg.
I don't see why you're being downvoted. Aside from being a little inflammatory your premise is correct.
It's not a secret that companies do not want to hire Americans. Americans are expensive and demand too many benefits like fair pay, healthcare, and vacations. They also are (mostly) at-will. H1B solves all these problems. When that doesn't work, there are 400 Infosys-likes available to export that labor cheaply. We have seen this with several industries, the most recent prominent one being auto manufacturing.
All that matters is that the next quarter's earnings are more than the last. No one hates the American worker more than Americans. Other countries have far better worker protections than us.
I see no reason H1B couldn't be solved by having a high barrier to entry ($500k one-time fee) and maintenance ($100k per year). Then, force them to be paid at the highest bracket in their field. If H1Bs are what its proponents say - necessary for rare talent not found elsewhere - then this fee should be pennies on the value they provide. I also see no reason we can't tax exported labor in a similarly extreme manner. If the labor truly can't be found in America, the high price of the labor in tax and fee terms should be dwarfed by its added value.
If it is not the case that high fees and taxes on H1B and exported labor make sense then the only conclusion is the vast majority of H1Bs and exported labor are not "rare talent" and thus aren't necessary. They can come through the normal immigration routes and integrate into the workforce as a naturalized American.
What exactly are the normal immigration routes? Employment-based immigration (H1B) is the only avenue that makes sense for a skilled worker. And usually skilled immigrants are the ones a country wants to attract.
Interesting. At least some of this has to be the bullwhip effect modeled with employers as retail, universities as suppliers, and graduating students as further back suppliers. The 4 year lead time in production of employable labour causes a whip crack backwards through the supply chain when there is a sudden shift at the retail end.
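A toy Python sketch of that framing, with made-up numbers: employer demand is "retail", enrollment chases current demand, and graduates only arrive four years later. It ignores the order-amplification part of the classic bullwhip story and just shows how the lead time turns a short hiring dip into a delayed glut-then-shortage in graduate supply.

```python
LEAD_TIME = 4  # years from enrollment to an employable graduate

def graduating_supply(demand: list[int], lead_time: int = LEAD_TIME) -> list[int]:
    """Enrollment naively tracks current demand; graduates appear `lead_time` years later."""
    enrollment = list(demand)  # naive policy: enroll exactly what employers want *today*
    return [
        enrollment[year - lead_time] if year >= lead_time else demand[0]  # pre-sim steady state
        for year in range(len(demand))
    ]

# Hypothetical demand: steady, a two-year dip, then recovery.
demand = [100, 100, 100, 60, 60, 100, 100, 100, 100, 100]
supply = graduating_supply(demand)
for year, (d, s) in enumerate(zip(demand, supply)):
    print(f"year {year}: demand={d:3d}  graduates={s:3d}  gap={s - d:+d}")
# The dip years are flooded with pre-dip cohorts (gap > 0), and the matching
# shortage only arrives four years later (gap < 0), after demand has recovered.
```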
It's true that a lot of things which were once junior contributor things are now things I'd rather do, but my scarce resource is attention. And humans have a sufficiently large context window and self-agentic behaviour that they're still superior to a machine.
The 'recent graduates' quoted in this article all seem to be from (for lack of a better description) 'developing countries' hoping to get a (again, generalizing) 'high-paying FAANG job'.
My initial reaction would be that these people, unfortunately, got scammed, and that the scammers-promising-abundant-high-paying-jobs have now found a convenient scapegoat?
AI has done nothing so far to reduce the backlog of junior developer positions from where I can see, but, yeah, that's all in "Europoor" and "EU residency required" territory, so what do I know...
For the last few decades it's been offshoring that filled the management agenda in the way AI does today, so it doesn't seem surprising to me that the first gap would be in the places you might offshore a testing department to, etc.
Offshoring has the exact same benefits/problems that AI has (i.e: it's cheap, yet you have to specify everything in excruciating detail) and has not been a significant factor in junior hiring, like, ever, in my experience.
Currently helping with hiring, and I can't help but reflect on how it has changed over the past couple of years. We are now filtering for much stronger candidates across all experience levels, but the junior side of the scale has been affected much more. Where previously we would take the top 5% of junior applicants who made it past the first phone screen, now it's below 2%.
"This article was amended on 26 June 2025 to clarify that the link between AI and the decline in graduate jobs is something suggested by analysts, rather than documented by statistics"
Plus, that decline seems specious anyway (as in: just-about visible when you only observe the top 5% of the chart), and the UK job market has always been very different from the EU they left behind.
Again, in my experience, that simply never happened, at least not with regard to junior positions.
During COVID we were struggling to retain good developers that just couldn't deal with the full-remote situation[1], and afterwards, there was a lull in recent graduates.
Again, this is from a EU perspective.
[1] While others absolutely thrived, and, yeah, we left them alone after the pandemic restrictions ended...
Maybe rather than telling everyone to "learn to code" we could have told them to do jobs they are more suited to doing: serving food, nursing, construction etc. all which have tangible benefits to society.
When I went to Japan, it felt like all kinds of people were doing all kinds of jobs many hours into the day, whether it is managing an arcade, selling tickets at the station, working at a konbini or whatever small job. Maybe we need to not give such lofty ideas to the new generation and represent blue collar jobs as "foreigner" or "failure" jobs.
For that to work, we would first need to make those blue collar jobs into ones that actually pay well enough for people to thrive instead of merely survive
Both the economy and culture play a role in the types of jobs people aim for. Tech used to be for "nerds" now it is "cool" because they watched The Social Network and Bill Gates told them they are smart enough to be part of the club.
Japan doesn't do that. Those part time jobs don't pay very well. There's just much lower overhead to having them, you don't have to own a car or a giant house to be in commute range etc.
Yeah. Where are all the great new Mac native apps putting Electron to shame, the avalanche of new JS frameworks, and the affordable SaaS to automate more of life? AI can write decent code, so why am I not benefiting from that as a consumer?
Neuralink and quantum computers are hitting interesting milestones, with Microsoft releasing a processor chip for quantum computing. Green steel is another interesting one, though not as 'sexy' as the previous two.
Uhhh, LLMs? The shit computers can do now is absurd compared to 2020. If you showed engineers from 2020 Claude, Cursor, and Stable Diffusion and didn't tell them how they worked their minds would be fucking exploding.
This article asserts 7 times that jobs are being replaced by AI and the only data to substantiate it is a link to an EY report that is paywalled, doesn't hold up to the text of the link, and doesn't hold up to what contemporary journalists wrote about the report.
I am not a hiring manager, but if there is more supply of non-junior workers, doesn't it make sense to actually hire those, especially if compensation might even be lower than before? You hire juniors to support them, or because experienced devs are too expensive or there simply aren't enough of them. If there are enough of them on the market for a more reasonable price, wouldn't choosing from that cohort make more sense?
I had the privilege of working with a great SWE intern this year. Their fresh ideas and strong work ethic made a real impact. Experienced engineers need this kind of energy.
Yes many over-rely on LLMs, but new engineers see possibilities we've stopped noticing and ask the questions we've stopped asking. Experience is invaluable, but it can quietly calcify into 'this is just how things are done.'
Everyone loves blaming AI for entry-level woes, but check the numbers: CS grads hit 6.1% unemployment while nursing sits at 1.4%. That's not "wiping out" jobs, that's oversupply meeting picky hiring after years of "learn to code" hype.
AI is eating the boring tasks juniors used to grind: data cleaning, basic fixes, report drafts. Companies save cash, skip the ramp-up, and wonder why their mid-level pipeline is drying up. Sarcastic bonus: great for margins, sucks for growing actual talent.
Long term though, this forces everyone to level up faster. Juniors who grok AI oversight instead of rote coding will thrive when the real systems engineering kicks in. Short term pain, massive upside if you adapt.
Basic coding to solve simple problems is something that high schoolers and even bright middle schoolers can do. By the time I was in college I had been coding for most of a decade. Part of the issue is that many of the folks coming out of school started learning this stuff WAY too late.
It's like if you waited until college to start learning to play piano, and wonder why you can't get a job when you graduate. You need a lot of time at the keyboard (pun intended) to build those skills.
- very few teams have headcount or are expecting to grow
- the number of interview requests I get has dropped off a cliff.
So BigTech is definitely hiring less, IMHO.
That said, I am not sure if it's only or even primarily due to replacement by AI. I think there's generally a lot of uncertainty about the future, and the AI investment bubble popping, and hence companies are being extra cautious about costs that repeat (employees) vs costs that can be stopped whenever they want (buying more GPUs).
And in parallel, they are hoping that "agents" will reduce some of the junior hiring need, but this hasn't happened at scale in practice, yet.
I would expect junior SWE hiring to slowly rebound, but likely stabilize at a slower pace than in the pre-layoff years.
I only want to point out that evidence of less hiring is not evidence for AI-anything.
As others have pointed out, here and previously, things like outsourcing to India, or for Europe to Eastern Europe, are also going strong. That's another explanation for fewer jobs "here": they are not gone, they just moved to cheaper places. As has been going on for decades, it just continues unevenly.
I guess the AI returns are not there as soon as expected.
AI? Ah, India.
"Over $50 billion in under 24 hours: Why Big Tech is doubling down on investing in India" https://www.cnbc.com/2025/12/11/big-tech-microsoft-amazon-go...
Microsoft recently announced the intent to train 20 MILLION Indian workers.
It might be a question of where the seniors put their time: coaching juniors or working with AI tools.
My senior SWE job at FAANG has essentially turned into prompting Opus 4.5.
There is almost no reason to delegate the work, especially low level grunt work.
People disputing this are either in denial, or lacking the skill set to leverage AI.
One or two more Opus releases from Anthropic and this field is cooked.
What kind of work do you do that is simple enough that can be accomplished solely through prompting?
The golden handcuff type where you update documentation with new UI elements.
distributed systems, log diving, deployments, etc
What kind of work do you do that CAN'T be divided into tasks that can be accomplished mostly through prompting?
Any sort of web tech based development.
Frontend, backend, animations, design, infra, distributed systems engineering, networking.
It's a troll account called llmslave made a couple months ago. Odds are low it's even a human.
It all depends on how you prompt, and the prompt system you’ve set up. When done well, you just “steer” the code/system. Quite amazing to see it come together. But there are multiple layers to this.
> lacking the skill set to leverage AI
Is it possible that your job is simply not that difficult to begin with?
Yes, but so are most jobs like mine.
What job is so difficult that LLMs can't allow an experienced user an order of magnitude gain in efficiency?
Unfortunately, I have the same experience.
It seems you've registered this account a couple of months ago only to basically repeat this opinion over and over (sprinkled with some anti-science opinions on top).
Really weird.
The world has changed; have you caught up?
The best part about your account is the people who don't understand the satire and unironically agree with you :D
username checks out
Unfortunately if it takes you 4 years to significantly upskill in tech, you are learning way too slow to survive in this industry. Most of the major innovators I know are dropouts, because they realized college is suited to train you to work in academia, where very few jobs exist, almost no one worth working for cares about degrees anymore, and the debt only makes surviving harder.
IMO the best education and credentials come from picking interesting projects you have no idea how to do, then learning everything in your way to ship them as open source so potential employers can see your work.
If you can get a degree on a scholarship for free, wonderful, but college should be viewed as more of a hobby or a way to network, rather than a way of obtaining marketable technical skills.
I don’t agree that “college is to train you to work in academia”.
I work in FAANG, none of my colleagues are dropouts.
Many BigTech founders are dropouts, but that’s a separate game altogether.
I would agree FAANGs are an exception: they have historically hired almost exclusively academics who hire other academics, and it shows. They let many coast for years at a time and get away with being specialists unable to deliver value outside of their specialization or to rapidly learn new skills. Many get the job through academic success and treat the job as a continuation of their academic career. Many get "tenure" and can do whatever they want and are effectively paid to just not work for competitors.
I know lots of people working at those orgs who brag about how well they get away with doing nothing of value, and we all know these people (though of course not everyone is like that).
No offense, but I do not feel the overwhelming majority of roles at these companies deliver value to humanity (as opposed to shareholders), or that they are something most people should aspire to in a career, and I do not think most of the skills learned in these orgs are all that useful in the world outside those walls.
Also, those same FAANGs are clearly aware of the above at some level and are doing mass layoffs, or not replacing people who leave, and those workers are having a really hard time finding a home in the non-FAANG working world, where they are expected to be highly motivated generalists.
Outsourcing, end of ZIRP, end of R&D tax credit. Macro-economic conditions are pushing companies to do more with fewer people. AI might be helping with this, but it's pure marketing BS to blame it for the state of tech employment.
Not that I think one should put too much stock in headlines, but "wiping out" seems to translate to a 6.1% unemployment rate and a 16.5% underemployment rate?
https://www.finalroundai.com/blog/computer-science-graduates...
I think the numbers you are arguing with here are for all employees, not just fresh graduates.
Blame the article for using suboptimal numbers, but the "wiping out" part is definitely justified when talking about jobs for graduates.
When you see 6.1% unemployment for computer science new grads, that invariably comes from
https://www.newyorkfed.org/research/college-labor-market#--:...
Computer science is tied for the fourth-lowest underemployment rate, has the 7th-highest unemployment rate... and also has the highest early-career median wage.
That needs to be compared to the underemployment chart https://www.newyorkfed.org/research/college-labor-market#--:... and the unemployment chart https://www.newyorkfed.org/research/college-labor-market#--:... (and make sure to compare that with 2009).
Computer science is not getting wiped out by AI. Entry-level jobs exist, though people may need to reset their expectations (note the median job being $80k) away from getting a $150k job out of college; that was always the exception rather than the average.
There are average jobs out there that people with a "want to be on the coast making $150k" or "must be remote so I don't have to relocate" attitude are thumbing their noses at.
It would be justified if AI were actually the cause, but this article does nothing to prove that. The only "tech jobs" that can even demonstrate direct replacement are call-center type roles. Everything else is just loosely blamed on AI, which is a convenient scapegoat as billions of dollars of investment are redirected from hiring to building data centers.
>I think the numbers you are arguing with here are for all employees, not just fresh graduates.
If you click through to new york fed's website, the unemployment figures are 4.8% for "recent college graduates (aged 22-27)", 2.7% for all college graduates, and 4.0% for all workers. That's elevated, but hardly "wiping out".
The article refers to this article from May, which claims a 50% reduction in graduate tech hiring from pre-pandemic levels and a 25% reduction since 2023:
https://www.signalfire.com/blog/signalfire-state-of-talent-r...
The chart with that data is https://cdn.prod.website-files.com/6516123533d9510e36f3259c/...
Starting at 2019 and saying "pre-pandemic levels" might be a bit disingenuous since that was a leap to a boom... and the bust we're seeing now.
https://www.cbre.com/insights/articles/tech-boom-interrupted
Yes, tech hiring in 2025 is down from 2019. That's a lot like saying "tech hiring is down from 2000" in 2003.
What happens when there are no more entry-level humans to be promoted to mid-level, and so on?
> What happens when there are no more entry-level humans to be promoted to mid-level, and so on?
No business cares about that question, just like the Onceler didn't care how many Truffula trees were left. It's not their problem. Business is business, and business must grow, regardless of crummies in tummies, you know.
It even has a name: the tragedy of the commons. I have been saying it constantly for the last few years with all this AI hype over LLMs going on. But with business focus really narrowing down to short time frames, what do you expect?
The "business" doesn't care about this, but individual employees care about their job duties, not their business. And some of them do have a job duty where they care about this.
(i.e. this cynical complaint is exactly the opposite of the cynical complaint about managers/directors engaging in empire building.)
That line always hits hard whenever I read that story to my kids.
Well, look at what has always happened in society when young people have no hope for the future: massive societal disruption, mostly in the form of revolution and violence.
Since this isn't the 1800s anymore there won't be any major revolutions, but I expect far more societal violence going forward. If you have no hope for the future it's not hard to go down very dark paths quickly, usually through no fault of your own, sadly.
Now add how easy it is for malicious actors to get an audience and how LLM tech makes this even easier to do. Nice recipe for a powder keg.
> Since this isn't the 1800s anymore there won't be any major revolutions
I'm sure they were saying the same thing in the 1800s
Well I have an idea:
what if we all just blame the youth?
I think that might fix the situation
In the COBOL world, there are lots of highly paid senior consultants who come in and out of retirement to support systems.
Other than that, I am guessing junior roles will move offshore to the body shops where corporate IT work has been going.
Big tech is doing it on purpose with H-1Bs and exportation of labor to capture the market in India and non-China Asia. They are desperate and afraid.
The U.S. has a national security interest in completely stopping all of it. They don't, because every administration is paid not to.
Regulate tech, ban labor export, ban labor import, protect your countries from the sellout.
I don't see why you're being downvoted. Aside from being a little inflammatory your premise is correct.
It's not a secret that companies do not want to hire Americans. Americans are expensive and demand too many benefits like fair pay, healthcare, and vacations. They also are (mostly) at-will. H1B solves all these problems. When that doesn't work, there are 400 Infosys-likes available to export that labor cheaply. We have seen this with several industries, the most recent prominent one being auto manufacturing.
All that matters is that the next quarter's earnings are higher than the last. No one hates the American worker more than Americans. Other countries have far better worker protections than us.
I see no reason H1B couldn't be fixed by having a high barrier to entry ($500k one-time fee) and maintenance ($100k per year). Then, force H1Bs to be paid at the highest bracket in their field. If H1Bs are what their proponents say, necessary for rare talent not found elsewhere, then this fee should be pennies on the value they provide. I also see no reason we can't tax exported labor in a similarly extreme manner. If the labor truly can't be found in America, the high price of that labor in tax and fee terms should be dwarfed by its added value.
If it is not the case that high fees and taxes on H1B and exported labor make sense, then the only conclusion is that the vast majority of H1Bs and exported labor are not "rare talent" and thus aren't necessary. They can come through the normal immigration routes and integrate into the workforce as naturalized Americans.
What exactly are the normal immigration routes? Employment-based immigration (H1B) is the only avenue that makes sense for a skilled worker. And usually skilled immigrants are the ones a country wants to attract.
Interesting. At least some of this has to be the bullwhip effect, modeled with employers as retail, universities as suppliers, and graduating students as suppliers further back in the chain. The 4-year lead time in the production of employable labour causes a whip crack backwards through the supply chain when there is a sudden shift at the retail end.
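A minimal sketch of that dynamic, purely for illustration: all the numbers, the one-year demand dip, and the trend-extrapolation rule for university intake are my own assumptions, not anything taken from real data.

```python
# Toy bullwhip-effect sketch: employers hire against current demand, while
# universities set intake by extrapolating the latest hiring trend, and each
# cohort only becomes hireable after a multi-year production lag.
def simulate(years=15, lag=4):
    demand = [100] * years
    demand[5] = 80  # assumed one-off 20% dip in employer demand

    hires, enrollments = [], []
    graduates = [100] * lag  # pipeline already full when the simulation starts
    for year in range(years):
        # Employers can only hire whoever actually graduates this year.
        hires.append(min(demand[year], graduates[year]))
        # Universities over-react to the most recent hiring trend...
        prev = hires[-2] if len(hires) > 1 else hires[-1]
        trend = hires[-1] - prev
        enrollments.append(max(0, hires[-1] + 2 * trend))
        # ...and that cohort only graduates `lag` years later.
        graduates.append(enrollments[-1])
    return demand, hires, graduates[:years]

demand, hires, grads = simulate()
for y, (d, h, g) in enumerate(zip(demand, hires, grads)):
    print(f"year {y:2d}  demand {d:3.0f}  available grads {g:3.0f}  hires {h:3.0f}")
```

The point is just the shape: a single dip at the employer end produces a much larger swing in enrollment, and a shortage of hireable graduates years after demand has already recovered.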
It's true that a lot of things that were once junior-contributor work are now things I'd rather do, but my scarce resource is attention. And humans have a sufficiently large context window and enough self-agentic behaviour that they're still superior to a machine.
The 'recent graduates' quoted in this article all seem to be from (for lack of a better description) 'developing countries' hoping to get a (again, generalizing) 'high-paying FAANG job'.
My initial reaction would be that these people, unfortunately, got scammed, and that the scammers-promising-abundant-high-paying-jobs have now found a convenient scapegoat?
AI has done nothing so far to reduce the backlog of junior developer positions from where I can see, but, yeah, that's all in "Europoor" and "EU residency required" territory, so what do I know...
For the last few decades it's been offshoring that filled the management agenda in the way AI does today, so it doesn't seem surprising to me that the first gap would be in the places you might offshore a testing department to, etc.
Offshoring has the exact same benefits/problems that AI has (i.e., it's cheap, yet you have to specify everything in excruciating detail) and has not been a significant factor in junior hiring, like, ever, in my experience.
I'm currently helping with hiring and can't help but reflect on how it has changed over the past couple of years. We are now filtering for much stronger candidates across all experience levels, but the junior side of the scale has been affected much more. Where previously we would take the top 5% of junior applicants who made it past the first phone screen, now it's below 2%.
> AI has done nothing so far to reduce the backlog of junior developer positions from where I can see
Job openings for graduates are significantly down in at least one developed nation: https://www.theguardian.com/money/2025/jun/25/uk-university-...
"This article was amended on 26 June 2025 to clarify that the link between AI and the decline in graduate jobs is something suggested by analysts, rather than documented by statistics"
Plus, that decline seems specious anyway (as in: just about visible only in the top 5% of the chart), and the UK job market has always been very different from the EU they left behind.
Am I reading this article correctly: the job market was worse in 2017?
Was AI also responsible for that market? This seems a bit unsupported.
And, as usual, no mention of the massive shortsighted overhiring during the post-covid bull market.
Again, in my experience, that simply never happened, at least not with regard to junior positions.
During COVID we were struggling to retain good developers who just couldn't deal with the full-remote situation[1], and afterwards there was a lull in recent graduates.
Again, this is from an EU perspective.
[1] While others absolutely thrived, and, yeah, we left them alone after the pandemic restrictions ended...
Maybe rather than telling everyone to "learn to code" we could have told them to do jobs they are more suited to doing: serving food, nursing, construction, etc., all of which have tangible benefits to society.
When I went to Japan, it felt like all kinds of people were doing all kinds of jobs many hours into the day, whether it was managing an arcade, selling tickets at the station, working at a konbini, or whatever other small job. Maybe we need to stop giving such lofty ideas to the new generation and stop representing blue-collar jobs as "foreigner" or "failure" jobs.
For that to work, we would first need to make those blue-collar jobs into ones that actually pay well enough for people to thrive instead of merely survive.
Both the economy and culture play a role in the types of jobs people aim for. Tech used to be for "nerds" now it is "cool" because they watched The Social Network and Bill Gates told them they are smart enough to be part of the club.
Japan doesn't do that. Those part-time jobs don't pay very well. There's just much lower overhead to having them: you don't have to own a car or a giant house to be within commuting range, etc.
What if cool new tech is just slowing down and AI is masking it?
Not a "what if". Can you name 3 new cool technologies that have come out in the last 5 years?
1. Copilot for Microsoft PowerPoint
2. Copilot for Windows Notepad
3. Copilot for Windows 11 Start Menu
Yeah. Where are all the great new native Mac apps putting Electron to shame, the avalanche of new JS frameworks, and the affordable SaaS to automate more of life? AI can write decent code, so why am I not benefiting from that as a consumer?
LLMs, Apple Silicon, self-driving cars just off the top of my head without really thinking about it.
Neuralink; quantum computers are hitting interesting milestones, with Microsoft releasing a processor chip for quantum computing. Green steel is another interesting one, though not as "sexy" as the previous two.
Incredibly cheaper batteries and solar panels. Much better induction stoves.
Uhhh, LLMs? The shit computers can do now is absurd compared to 2020. If you showed engineers from 2020 Claude, Cursor, and Stable Diffusion and didn't tell them how they worked, their minds would be fucking exploding.
This article asserts 7 times that jobs are being replaced by AI and the only data to substantiate it is a link to an EY report that is paywalled, doesn't hold up to the text of the link, and doesn't hold up to what contemporary journalists wrote about the report.
Bad article. Hope a human didn't write it.
I am not a hiring manager. But if there is a larger supply of non-junior workers, doesn't it make sense to hire from that pool, especially if compensation might even be lower than before? You hire juniors to develop them, or because experienced devs are too expensive or there simply aren't enough of them. If there are enough of them on the market at a more reasonable price, wouldn't choosing from that cohort make more sense?
I had the privilege of working with a great SWE intern this year. Their fresh ideas and strong work ethic made a real impact. Experienced engineers need this kind of energy.
Yes many over-rely on LLMs, but new engineers see possibilities we've stopped noticing and ask the questions we've stopped asking. Experience is invaluable, but it can quietly calcify into 'this is just how things are done.'
Everyone loves blaming AI for entry-level woes, but check the numbers: CS grads hit 6.1% unemployment while nursing sits at 1.4%. That's not "wiping out" jobs, that's oversupply meeting picky hiring after years of "learn to code" hype.
AI is eating the boring tasks juniors used to grind: data cleaning, basic fixes, report drafts. Companies save cash, skip the ramp-up, and wonder why their mid-level pipeline is drying up. Sarcastic bonus: great for margins, sucks for growing actual talent.
Long term though, this forces everyone to level up faster. Juniors who grok AI oversight instead of rote coding will thrive when the real systems engineering kicks in. Short term pain, massive upside if you adapt.
I will include this thread in the https://hackernewsai.com/ newsletter.
Basic coding to solve simple problems is something that high schoolers and even bright middle schoolers can do. By the time I was in college I had been coding for most of a decade. Part of the issue is that many of the folks coming out of school started learning this stuff WAY too late.
It's like if you waited until college to start learning to play piano, and wonder why you can't get a job when you graduate. You need a lot of time at the keyboard (pun intended) to build those skills.
Youth unemployment is up, as is unemployment among new hires in general, because of uncertain and deteriorating business conditions.
We have new grads; they could not be replaced by AI. If you have new grads AI can replace, I'm not sure why the role required a college degree.
What's your field, broadly, if you don't mind sharing?
Evidence I can give in support of the article:
- very few teams have headcount or are expecting to grow
- the number of interview requests I get has dropped off a cliff
So BigTech is definitely hiring less IMHO.
That said, I am not sure if it's only, or even primarily, due to replacement by AI. I think there's generally a lot of uncertainty about the future and about the AI investment bubble popping, and hence companies are being extra cautious about costs that repeat (employees) vs. costs that can be stopped whenever they want (buying more GPUs).
And in parallel, they are hoping that "agents" will reduce some of the junior hiring need, but this hasn't happened at scale in practice, yet.
I would expect junior SWE hiring to slowly rebound, but likely stabilize at a slower pace than in the pre-layoff years.
> Evidence I can give in support of the article:
I only want to point out that evidence of less hiring is not evidence for AI-anything.
As others have pointed out, here and previously, things like outsourcing to India, or, for Europe, to Eastern Europe, are also going strong. That's another explanation for fewer jobs "here": they are not gone, they just moved to cheaper places. This has been going on for decades; it just continues unevenly.
https://www.cnbc.com/2025/12/11/big-tech-microsoft-amazon-go...
> Over $50 billion in under 24 hours: Why Big Tech is doubling down on investing in India
https://news.microsoft.com/source/asia/2025/12/09/microsoft-...
> Microsoft invests US$17.5 billion in India to drive AI diffusion at population scale
My personal experience is that it's not AI wiping out jobs, it's offshoring.
H1B and foreign worker visas are; AI is political cover, and it's a lie.
If only there were some kind of tool that junior engineers could use to build portfolio pieces to differentiate themselves.
And if only there were some kind of tool that junior engineers could use to upskill themselves to mid-level.
Sadly we’ll have to wait until at least 2022 for that.