Comment by Nextgrid

20 days ago

LLMs are only a threat if you see your job as a code monkey. In that case you're likely already obsoleted by outsourced staff who can do your job much cheaper.

If you see your job as a "thinking about what code to write (or not)" monkey, then you're safe. I expect most seniors and above to be in this position, and LLMs are absolutely not replacing you here - they can augment you in certain situations.

One of the perks of being a senior is also knowing when not to use an LLM and how they can fail; at this point I feel like I have a pretty good idea of what is safe to outsource to an LLM and what to keep for a human. Offloading the LLM-safe stuff frees up your time to focus on the LLM-unsafe stuff (or just chill and enjoy the free time).

I see my job as having many aspects. One of those aspects is coding. It is the aspect that gives me the most joy even if it's not the one I spend the most time on. And if you take that away then the remaining part of the job is just not very appealing anymore.

It used to be I didn't mind going through all the meetings, design discussions, debates with PMs, and such because I got to actually code something cool in the end. Now I get to... prompt the AI to code something cool. And that just doesn't feel very satisfying. It's the same reason I didn't want to be a "lead" or "manager", I want to actually be the one doing the thing.

  • You won't be prompting AI for the fun stuff (unless laying out boring boilerplate is what you consider "fun"). You'll still be writing the fun part - but you will be able to prompt beforehand to get all the boilerplate in place.

    • If you’re writing that much boilerplate as part of your day-to-day work, I daresay you’re Doing Coding Wrong. (Virtue number one of programming: laziness. https://thethreevirtues.com)

      Any drudgework you repeat two or three times should be encapsulated or scripted away, deterministically.
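A minimal sketch of what "scripted away, deterministically" might look like in practice (the module layout and names here are hypothetical, purely for illustration): instead of hand-writing or prompting for the same boilerplate each time, render it from a template so the output is identical every run.

```python
# Hypothetical example: generate repeated boilerplate deterministically
# from a template instead of retyping it (or prompting an LLM for it).
from string import Template

# Invented service-module skeleton, standing in for whatever you repeat.
MODULE_TEMPLATE = Template('''\
"""$name service module (generated)."""


class ${cls}Service:
    def __init__(self, repository):
        self.repository = repository

    def get(self, item_id):
        return self.repository.fetch(item_id)
''')


def scaffold(name: str) -> str:
    """Render the boilerplate for a new service module, the same way every time."""
    return MODULE_TEMPLATE.substitute(name=name, cls=name.capitalize())


print(scaffold("billing"))
```

Being a plain template expansion, the output is byte-for-byte reproducible, which is exactly the property an LLM can't promise for the same prompt.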


There are many tens (hundreds?) of billions of dollars being poured into the smartest minds in the world to push this thing forward.

I'm not so confident that it'll stay limited to code monkeys for long.

  • Until they can magically increase context length to a size that conveniently fits the whole codebase, we're safe.

    It seems like the billions so far mostly go to talk of LLMs replacing every office worker, rather than any action to that effect. LLMs still have major (and dangerous) limitations that make this unlikely.

  • > the smartest minds in the world

    Dunning–Kruger is everywhere in the AI grift. People who don't know a field try to deploy some AI bot that solves the easy 10% of the problem so it looks good on the surface, and assume that just throwing money (which mostly just buys hardware) will solve the rest.

    They aren't "the smartest minds in the world". They are slick salesmen.

    • The other day someone referred to Claude Code as “the most complex terminal app” they’ve seen.

      Meanwhile folks are rendering videos in the terminal.


Agreed. Programming languages are not ambiguous. Human language is very ambiguous, so if I'm writing something with a moderate level of complexity, it's going to take longer to describe what I want to the AI vs writing it myself. Reviewing what an AI writes also takes much longer than reviewing my own code.

AI is getting better at picking up some important context from other code or documentation in a project, but it's still miles away from what it needs to be, and the needed context isn't always present.

Never mind coding: where is the LLM for legal stuff? Why are all these programmers working on automating their own jobs away instead of those of bloodsucking lawyers who charge hundreds of EUR per hour?

  • They are, you probably just aren't hearing about it. There have been loads of cases over the past few years where lawyers use AI to automate their legal research, then get admonished by the judge because their court filings contain fake quotes or reference court cases that don't even exist. A few examples: https://calmatters.org/economy/technology/2025/09/chatgpt-la... https://natlawreview.com/article/judge-issues-public-admonit... https://websitedc.s3.amazonaws.com/documents/Mezu_v._Mezu_US...

  • It’s happening just as fast for them. I literally sit next to our general counsel all day at the office. We work together continually. I show him things happening in engineering, and each time he shows me the analogous things happening in legal.

    This affects everyone.

  • Domain knowledge and gatekeeping. We don't know what is required in their role fully, but we do know what is required in ours. We also know that we are the target of potentially trillions in capital to disrupt our job and that the best and brightest are being paid well just to disrupt "coding". A perfect storm of factors that make this faster than other professions.

    It also doesn't help that some people in this role believe the SWE career is a sinking ship, which creates an incentive to climb over others and profit before it tanks (i.e. build AI tools, automate it, and profit). This is the typical "it isn't AI that replaces you, but the person who automates your job using AI".

  • There are many. My friend (a lawyer and a programmer) wrote one from scratch in their basement. This would have been a 4-person startup before.

Why is that safe in the medium to long term? If LLMs can already do the code-monkey work after just 4 years, why assume that in a couple more they can't talk to the seniors' direct reports and get requirements from them? I'm learning carpentry just in case.

Yes. And I'm excited as hell.

But I also have no idea how people are going to think about what code to write when they don't write code. Maybe this is all fine, is ok, but it does make me quite nervous!

  • That is definitely a problem, but I would say it's a problem of hiring, and of the billions of dollars of potential market cap resting on performative bullshit: companies avoid hiring juniors to send a signal and capture some of those billions, regardless of the actual impact on productivity.

    LLMs benefit juniors, they do not replace them. Juniors can learn from LLMs just fine and will actually be more productive with them.

    When I was a junior my “LLM” was StackOverflow and the senior guy next to me (who no doubt was tired of my antics), but I would’ve loved to have an actual LLM - it would’ve handled all my stupid questions just fine and freed up senior time for the more architectural questions or those where I wasn’t convinced by the LLM response. Also, at least in my case, I learnt a lot more from reading existing production code than writing it - LLMs don’t change anything there.

    • I agree that they can be used this way, and it would be less of a problem if they were. However, the current evidence we see from universities is that those who use LLMs to actually learn something are in the minority. The dopamine hit of something working without having had to do anything for it is much stronger.

I see what these can do and I'm already thinking, why would I ever hire a junior developer? I can fire up opencode and tell it to work multiple issues at once myself.

The bottleneck becomes how fast you can write the spec or figure out what the product should actually be, not how quickly you can implement it.

So the future of our profession looks grim indeed. There will be far fewer of us employed.

I also miss writing code. It was fun. Wrangling the robots is interesting in its own way, but it's not the same. Something has been lost.

  • You hire the junior developer because you can get them to learn your codebase and business domain at a discount, and then reap their productivity as they turn senior. You don’t get that with an LLM since it only operates on whatever is in its context.

    (If you prefer to hire seniors that’s fine too - my rates are triple that of a junior and you’re paying full price for the time it takes me learning your codebase, and from experience it takes me at least 3 months to reach full productivity.)

  • > why would I ever hire a junior developer

    Because a junior developer doesn't stay a junior developer forever. The value of junior developers has never been the code they write. In fact, in my experience they're initially a net negative, as more senior developers take time to help them learn. But it's an investment, because they will grow into more senior developers.

    • The question really is what you think the long-term direction of SWE as a profession is. If we need juniors later and seniors become expensive, that's mostly a nice problem to have and can be fixed via training and knowledge transfer. Conversely, people being hired and trained, especially when young, into a sinking industry isn't doing anyone any favors.

      While I think both sides have an argument on the eventual viability of the SWE career, there is a problem. The downsides of hiring now (costs, uncertainty of work velocity, dry backlogs, etc.) are certain; the risk of paying more later is not guaranteed and may not be as big an issue. Also, training juniors doesn't always benefit the person paying.

      * If you think that long term we will need seniors again (the industry stays the same size or starts growing again), then given the usual high ROI on software, most can afford to defer that decision until later. That goes back to pre-AI calculus: SWEs were expensive then, and people still paid for them.

      * If you think that the industry shrinks, then it's better to hold off so you get more out of your current staff and don't "hire to fire". Hopefully the industry on average shrinks in proportion to the natural retirement of staff; I've seen this happen, for example, in local manufacturing, where the plant lives on but slowly winds down over time, and as people retire they aren't replaced.


LLMs are a threat to the quality of code in a similar (but much more dramatic) way to high-level languages and Electron. I am slightly worried about keeping a job if there's a downturn, but I'm much more worried about my job shifting into being the project manager for a farm of slop machines with no taste and a complete inability to learn.

I think it’s naive to assume that every part of our jobs won't, worryingly soon, be automated. All the way up to and including CEO. This is not exciting.

If you believe juniors are already not safe, it’s only a question of time before seniors are in the same position. First they came for the socialists, etc etc.