
Comment by sgarland

11 hours ago

Agreed. The higher-ups at my company, like those at most places, are breathlessly talking about how AI has changed the profession - how we no longer need to code, but merely describe the desired outcome. They say this as though it’s a good thing.

They’re destroying the only thing I like about my job - figuring problems out. I have a fundamental impedance mismatch with my company’s desires, because if someone hands me a weird problem, I will happily spend all day or longer on that problem. Think, hypothesize, test, iterate. When I’m done, I write it up in great detail so others can learn. Generally, this is well-received by the engineer who handed the problem to me, but I suspect it’s mostly because I solved their problem, not because they enjoyed reading the accompanying document.

FWIW, when a problem truly is weird, AI and vibe coding tend not to be able to solve it. Maybe you can use AI to help you spend more time working on the weird problems.

When I play sudoku with an app, I like to turn on auto-fill, auto-erase, and highlighting of the current number. This lets me go directly to the crux of the puzzle and work on that. It helps me practice the hard part without having to slog through the stuff I already know how to do, and generally speaking it helps me do harder puzzles than I was doing before. BTW, I’ve only found one app so far that does this really well.
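For the curious, the "auto-erase" feature described above amounts to a simple constraint-propagation step. This is a minimal sketch (not any particular app's implementation): when a digit is placed, it is removed from the pencil-mark candidates of every peer cell in the same row, column, and 3x3 box.

```python
def peers(row, col):
    """Cells sharing a row, column, or 3x3 box with (row, col)."""
    same_row = {(row, c) for c in range(9)}
    same_col = {(r, col) for r in range(9)}
    br, bc = 3 * (row // 3), 3 * (col // 3)
    same_box = {(br + r, bc + c) for r in range(3) for c in range(3)}
    return (same_row | same_col | same_box) - {(row, col)}

def place(candidates, row, col, digit):
    """Place `digit` at (row, col) and auto-erase it from all peers."""
    candidates[(row, col)] = {digit}
    for cell in peers(row, col):
        candidates[cell].discard(digit)

# Start with every cell allowing all nine digits, then place a 5 at (0, 0):
# the 5 disappears from row 0, column 0, and the top-left box automatically.
candidates = {(r, c): set(range(1, 10)) for r in range(9) for c in range(9)}
place(candidates, 0, 0, 5)
```

The point of the feature is exactly what the comment says: the bookkeeping is mechanical, so automating it leaves only the genuinely hard deductions for the player.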

With AI it’s easier to see that there are a lot of problems I don’t know how to solve, but others do. The question is whether it’s wasteful to spend time solving such a problem independently. Personally I think it’s good for me to do it, and bad for my employer (at least in the short term). But I can completely understand higher-ups wanting to get rid of 90% of wheel re-invention, and I do think many programmers spend a lot of time doing exactly that: independently solving problems that have already been solved.

  • You touch on an aspect of AI-driven development that I don't think enough people realize: choosing to use AI isn't all or nothing.

    The hard problems should be solved with our own brains, and it behooves us to take that route so we can not only benefit from the learnings, but assemble something novel so the business can differentiate itself better in the market.

    For all the other tedium, AI seems perfectly acceptable to use.

    The sticking point comes when CEOs, product teams, or engineering leadership put too much pressure on using AI for "everything" - when every solution is expected to be AI-first even where that isn't appropriate, because velocity is too often prioritized over innovation.

    • > choosing to use AI isn't all or nothing.

      That's how I have been using AI the entire time. I do not use Claude Code or Codex. I just use AI to ask questions instead of parsing the increasingly poor Google search results.

      I use the chat interfaces in the web applications, with manual copy/pasting back and forth if/when necessary. It's been wonderful: I feel quite productive, and I don't really have much of an AI dependency. I am still doing all of my work, but I can get a quicker answer to simple questions than by parsing through a handful of outdated blogs and StackOverflow answers.

      If I have learned one thing about programming computers in my career, it is that not all documentation (even official documentation) is created equal.

Though it is not as though management has ever appreciated the creative aspects of the job, including problem solving. Management has always wished to simply describe the desired outcome and get magic back, and it doesn't like acknowledging that problems and complications exist in the first place. Management likes to think it is the true creative force behind the company's vision, and doesn't like software developers finding solutions bottom-up. It prefers a single "architect" and maybe a single "designer" on the creative side, people it likes and who are a rising political force (in either the Peter Principle or the Gervais Principle sense), rather than dealing with a committee of creative people. It's easier to pretend software developers are blue-collar cogs in the system than white-collar problem solvers with complex creative specialties. LLMs are only accelerating those mechanics and beliefs.

  • Agreed. I hate to say it, but if anyone thinks this managerial train of thought is bad now, it's going to get much worse, and unfortunately burnout is going to sweep the industry as tech workers feel ever more underappreciated and invisible to their leaders.

    And worse: with few opportunities to grow their skills through the kind of rigorous thinking this blog post describes, tech workers will be relegated to cleaning up after sloppy AI codebases.

    • I share that deep cynicism. I've spent a lot of my career in the legacy code mines, and a lot of it trying to climb my way out of them, or at least find nicer, more lucrative mines. LLMs are the "gift" of legacy-code-as-a-service: they only magnify and amplify the worst parts of my career. The way the "activist shareholder" class likes to over-hype and believe in generative-AI magic today only implies things have more room to keep getting worse before they get better (if they ever get better again).

      I'm trying my best to adapt to being a "centaur" in this world. (In chess it became statistically evident that human and bot players alone are generally "worse" than hybrid "centaur" players.) But even centaurs are going to be increasingly taken for granted by companies, and at least for me the sense is growing that, as WOPR concluded about tic-tac-toe (and thermonuclear warfare), it's "a strange game; the only way to win is not to play". I don't know how I'd bootstrap an entirely new career at this point in my life, but I keep feeling like I need to try to figure that out. I don't want to just be a janitor of other people's messes for the rest of my life.

> They’re destroying the only thing I like about my job - figuring problems out.

So, tackle other problems. You can now do things you couldn't even have contemplated before. You've been handed a near-godlike power, and all you can do is complain about it?

  • > You can now do things you couldn't even have contemplated before. You've been handed a near-godlike power, and all you can do is complain about it?

    This seems to be a common narrative, but TBH I don't really see it. Where is all the amazing output from this godlike power? It certainly doesn't seem like tech is suddenly improving at a faster pace. If anything, it seems to be regressing in a lot of cases.