Comment by dkdbejwi383

1 day ago

Fair enough but I am a programmer because I like programming. If I wanted to be a product manager I could have made that transition with or without LLMs.

Agreed. The higher-ups at my company are, like most places, breathlessly talking about how AI has changed the profession - how we no longer need to code, but merely describe the desired outcome. They say this as though it’s a good thing.

They’re destroying the only thing I like about my job - figuring problems out. I have a fundamental impedance mismatch with my company’s desires, because if someone hands me a weird problem, I will happily spend all day or longer on that problem. Think, hypothesize, test, iterate. When I’m done, I write it up in great detail so others can learn. Generally, this is well-received by the engineer who handed the problem to me, but I suspect it’s mostly because I solved their problem, not because they enjoyed reading the accompanying document.

  • FWIW, when a problem truly is weird, AI and vibe coding tend not to be able to solve it. Maybe you can use AI to help you spend more time working on the weird problems.

    When I play sudoku with an app, I like to turn on auto-fill numbers, and auto-erase numbers, and highlighting of the current number. This is so that I can go directly to the crux of the puzzle and work on that. It helps me practice working on the hard part without having to slog through the stuff I know how to do, and generally speaking it helps me do harder puzzles than I was doing before. BTW, I’ve only found one good app so far that does this really well.

    With AI it’s easier to see that there are a lot of problems I don’t know how to solve, but others do. The question is whether it’s wasteful to spend time independently solving such a problem. Personally I think it’s good for me to do it, and bad for my employer (at least in the short term). But I can completely understand the desire for higher-ups to get rid of 90% of wheel re-invention, and I do think many programmers spend a lot of time doing exactly that: independently solving problems that have already been solved.

    • You touch on an aspect of AI-driven development that I don't think enough people realize: choosing to use AI isn't all or nothing.

      The hard problems should be solved with our own brains, and it behooves us to take that route so we can not only benefit from the learnings, but assemble something novel so the business can differentiate itself better in the market.

      For all the other tedium, AI seems perfectly acceptable to use.

      Where the sticking point comes in is when CEOs, product teams, or engineering leadership put too much pressure on using AI for "everything", in that all solutions to a problem should be AI-first, even if it isn't appropriate—because velocity is too often prioritized over innovation.


  • Though it is not like management roles have ever appreciated the creative aspects of the job, including problem solving. Management has always wished to just describe the desired outcome and get magic back. They don't like acknowledging that problems and complications exist in the first place. Management likes to think that they are the true creatives behind company vision, and they don't like software developers finding solutions bottom-up. Management likes to have a single "architect" and maybe a single "designer" for the creative side, someone they like and who is a "rising" political force (in either the Peter Principle or Gervais Principle sense), rather than deal with a committee of creative people. It's easier for them to pretend software developers are blue-collar cogs in the system rather than white-collar problem solvers with complex creative specialties. LLMs are only accelerating those mechanics and beliefs.

    • Agreed. I hate to say it, but if anyone thought this train of thought in management was bad now, it's going to get much worse, and unfortunately burnout is going to sweep the industry as tech workers feel ever more underappreciated and invisible to their leaders.

      And worse: with few opportunities to grow their skills from rigorous thinking as this blog post describes. Tech workers will be relegated to cleaning up after sloppy AI codebases.


  • > They’re destroying the only thing I like about my job - figuring problems out.

    So, tackle other problems. You can now do things you couldn't even have contemplated before. You've been handed a near-godlike power, and all you can do is complain about it?

    • > You can now do things you couldn't even have contemplated before. You've been handed a near-godlike power, and all you can do is complain about it?

      This seems to be a common narrative, but TBH I don't really see it. Where is all the amazing output from this godlike power? It certainly doesn't seem like tech is suddenly improving at a faster pace. If anything, it seems to be regressing in a lot of cases.

I’m a programmer (well, half my job) because I was a short (still short) fat (I got better) kid with a computer in the 80s.

Now, the only reason I code, and have since the week I graduated from college, is to support my insatiable addictions to food and shelter.

While I like seeing my ideas come to fruition, over the last decade my ideas were a lot larger than I could reasonably build in 40 hours without other people working on the projects I lead. Until the last year and a half, when I could do it myself using LLMs.

Seeing my carefully designed spec, including all of the cloud architecture, get done in a couple of days - with my hands on the wheel - when it would have taken at least a week of me doing some of the work while juggling a couple of other people, is life-changing.

  • Not sure why this is getting downvoted, but you're right — being able to crank out ideas on our own is the "killer app" of AI so to speak.

    Granted, you would learn a lot more if you had pieced your ideas together manually, but it all depends on your own priorities. The difference is, you're not stuck cleaning up after someone else's bad AI code. That's the side to the AI coin that I think a lot of tech workers are struggling with, eventually leading to rampant burnout.

    • What would I learn that I don’t already know? The exact syntax and properties of Terraform and boto3 for every single one of the 150+ services that AWS offers? How to modify a React-based front end written by another developer, even though I haven’t done front-end development, and have actively stayed away from it, for well over a decade?

      Will a company pay me more for knowing those details? Will I be more effectively able to architect and design solutions that a company will pay my employer to contract me to do, and my company pays me? They pay me decently not because I “codez real gud”. They pay me because I can go from an empty AWS account, an empty repo, and ambiguous customer requirements to a working solution (after spending time talking to the customer), to a full, well-thought-out architecture plus code, on time, on budget, and meeting requirements.

      I am not bragging. I’m old; those are table stakes to staying in this game for 3 decades.

I became an auto mechanic because I love machining heads, and dropping oil pans to inspect, and fitting crankshafts in just right, and checking fuel filters, and adjusting alternators.

If I wanted to work on electric power systems I would have become an electrician.

(The transition is happening.)