Comment by onion2k

3 days ago

I love AI for speed running through all the boring stuff and getting to the good parts.

In some cases, especially with the more senior devs in my org, fear of the good parts is why they're against AI. Devs often want the inherent safety of the boring, easy stuff for a while. AI changes the job to be a constant struggle with hard problems. That isn't necessarily a good thing. If you're actually senior by virtue of time rather than skill, you can only take on a limited number of challenging things one after another before you get exhausted.

Companies need to realise that AI to go faster is great, but there's still a cognitive impact on the people. A little respite from the hardcore stuff is genuinely useful sometimes. Taking all of that away will be bad for people.

That said, some devs hate the boring easy bits and will thrive. As with everything, individuals need to be managed as individuals.

That makes me think of https://store.steampowered.com/app/2262930/Bombe/ which is a version of Minesweeper where instead of clicking on squares you define (parametric!) rules that propagate information around the board automatically. Your own rules skip all the easy parts for you. As a result, every challenge you get is by definition a problem that you've never considered before. It's fun, but also exhausting.

  • I remember listening to a talk about Candy Crush and how they designed the game to have a few easy levels in between the hard ones, to balance feeling like you're improving while also challenging players. If all the levels get progressively harder, then a lot of people lose motivation to keep playing.

  • Oooohhh....

    That looks like plenty of hours of fun! Thanks for the link :)

  • I just tried that game and it looks very interesting but the interface is cryptic.

Interesting point.

There's also the fact that, while you're coding the easy stuff, your mind is thinking about the hard stuff, looking things up, seeing how the pieces fit together. If you're spending 100% of your time on the hard stuff, you might be hurting these preliminaries.

  • This makes no sense. Yes, having time to think about the hard part is good, but just because you’re not doing the boilerplate anymore doesn’t mean you can’t do the thinking part anymore! See how absurd it sounds when you actually describe it this way?

    • Let me rephrase.

      I know brilliant people who took up knitting to keep their hands busy while they think over their difficult problems. But that only works if you can knit in your work hours. Sadly, despite clearly improving the productivity of these people, this is a fireable offense in many jobs.

      I'm not saying that the only way to think through a hard problem is to work on boilerplate. If you're in a workplace where you can knit, or play table soccer, and these things help you, by all means go for it.

      What I'm thinking out loud is that if we're dedicating 100% of our time to the hard problems, we'll hit a snag, and that boilerplate may (accidentally) serve as the padding that keeps us from actually being at 100%.

      That being said, I'm not going to claim this as a certainty, just an idea.

> Devs often want the inherent safety of the boring, easy stuff for a while

That matches my experience. In my first job, every time a new webapp project started, it was fun. Not because of challenges or design, but simply because of the trivial stuff done for the n-th time - user accounts, login, password reset, admin panel. It probably should have been automated by that point, but we got away with reinventing the wheel each time.

> AI changes the job to be a constant struggle with hard problems

I find this hilarious. From what I've seen watching people do it, it changes the job from deep thought and figuring out a good design to pulling a lever on a slot machine and hoping something good pops out.

The studies that show diminished critical thinking have matched what I saw anecdotally pairing with people who vibe coded. It replaced deep critical thinking with a kind of faith-based gambler's mentality ("maybe if I tell it to think really hard it'll do it right next time...").

The only times I've seen a notable productivity improvement is when it was something not novel where it didn't particularly matter if what popped out was shit - e.g. a proof of concept, an ad hoc app, something that would naturally either work or fail obviously, etc. The buzz people get from these gamblers' highs when it works seems to make them happier than if they didn't use it at all, though.

  • Which was my original point. Not that the outcome is shit. So much of what we write is absolutely low-skill and low-impact, but necessary and labor-intensive. Most of it is so basic and boilerplate you really can't look at it and know if it was machine- or human-generated. Why shouldn't that work get cranked out in seconds instead of hours? Then we can do the actual work we're paid to do.

    To pair this with the comment you're responding to, the decline in critical thinking is probably a sign that there's many who aren't as senior as their paycheck suggests. AI will likely lead to us being able to differentiate between who the architects/artisans are, and who the assembly line workers are. Like I said, that's not a new problem, it's just that AI lays that truth bare. That will have an effect generation over generation, but that's been the story of progress in pretty much every industry for time eternal.

    • > So much of what we write is absolutely low-skill and low-impact, but necessary and labor-intensive. Most of it is so basic and boilerplate you really can't look at it and know if it was machine- or human-generated.

      Is it really? Or is it a refusal to do actual software engineering: letting the machine take care of it (deterministically) and moving up the ladder in terms of abstraction? I've seen people describe things as sludge, but they've never learned awk to write a simple script to take care of the work. Or learned how to use their editor, instead using the same patterns they would have with Notepad.

      I think it's better to take a step back and reflect on why we're spending time on basic stuff in the first place, instead of praying that the LLM will generate some good basic stuff.

    • > So much of what we write is absolutely low-skill and low-impact, but necessary and labor-intensive.

      I've never found this to be true once in my career.

      I know a lot of devs who looked down on CRUD or whatever it was they were doing and produced boilerplate slop though.

      Code isn't like pottery. There aren't "artisanal" devs who produce lovely code for people to look at like it's a beautiful vase. Good code that is hooked into the right product-market fit can reach millions of people if it works well.

      The world was replete with shitty code before AI and mostly it either got tossed away or it incurred epic and unnecessary maintenance costs because it actually did something useful. Nothing has changed on that front except the tsunami of shit got bigger.

  • I think there are two kinds of uses for these tools:

    1) you try to explain what you want to get done

    2) you try to explain what you want to get done and how to get it done

    The first one is gambling; the second one has a very small failure rate - at worst, the plan it presents shows it's not heading toward the solution you want.

    • The thing is to understand that a model has "priors" which steer how it generates code. If what you're trying to build matches the priors of the model, you can basically surf the gradients to working software with minimal steering, using declarative language. If what you want to build isn't well encoded by the model's priors, it'll constantly drift, and you need to use shorter prompts and specify the how more (imperative).

This is an interesting take on it. I've been using Cursor at work and notice that on days when I have it generate all the "easy parts", I end up feeling much more mentally exhausted than if I spend the day writing everything by hand.

  • That tracks. Being in flow on a task you're confident about is a low arousal, satisfying state.

    Monitoring AI output on any task is high arousal, low satisfaction, unless you're constantly prompting for quick wins.

> AI changes the job to be a constant struggle with hard problems.

Very true. I think AI (especially Claude Code) forced me to actually think hard about the problem at hand before implementing the solution. And more importantly, to write down my thoughts before they flit away from my feeble mind. A discipline I wish I'd had before.

  • That's strange, I've never thought it could be done this way. Normally I'd read the docs, maybe sketch some diagrams, then maybe take a walk while thinking about how to solve the problem, and by the time I got back to the screen I'd already have a good idea of how to do it.

    These days the only difference is that I feed my ideas to a few different LLMs to have "different opinions". Usually they're crap but sometimes they present something useful that I can implement.

Yes, I can see why some devs might prefer the safety of the boring, familiar stuff. But employers aren't going to care about that. They're going to hire and retain the devs who are more productive in the new era.

On the flip side, there have been lots of times where I personally couldn't deeply research a topic (read papers, build prototypes of different ideas, etc.) due to lack of time and resources. If all of the boring stuff is gone, and building prototypes is also 3x faster, maybe what will end up happening is that we can use all of this freed-up time to try lots of different ideas, because the cost of exploration has been lowered.

I think you're describing things we already knew long before this era of AI. Less code is better code, and the vast majority of bugs come from the devs who "hate the boring easy bits".

I disagree that this has anything to do with people needing a break. All code eventually has to be reviewed. Regardless of who or what wrote it, writing too much of it is the problem. It's also worth considering how much more code could be eliminated if the business more critically planned what they think they want.

These tensions have existed even before computers and in all professions.

That's crazy to me. I solve problems. I'm not a janitor or a tradesman; you bring me in to envision and orchestrate solutions that bring bottom-line value. I live to crack hard nuts, and if I never have to bother with rigging again I'll be so happy.

> Devs often want the inherent safety of the boring, easy stuff for a while.

Part of the problem is that in big orgs, you need to show consistent progress in order to not get put on some PIP and kicked out of the company. There are performance review cycles and you have to show something continuously.

That ONLY works if you have boring, easy work. It's easy to deliver consistent progress on that.

Interesting and difficult work is nice only if you are trusted to try your best and given the freedom to fail. That's the nature of hard problems: progress in those domains is sudden, Poissonian, and not consistent by nature. If you're going to be judged on your ability to be sub-Poissonian and consistent, and put on a PIP for not succeeding in one review cycle (possibly risking the income that puts a roof over your head or feeds your family), it's not worth the career risk to try difficult things.

Not saying this is the way I think, it's just the reality of how things often work in big orgs, and one of the reasons I dislike many big orgs.

This is exactly why people hate AI. It disrupts the comfort of easy coding.

  • The main challenge with any creative effort, including and especially programming, is motivation. "Easy coding" gives you small mental wins that build your dopamine circuits and give you something to chase. When people say that "AI takes the fun out of coding" they mean that they're not getting those rewards anymore. It might make coding easier (though I'm not sure it actually does, ultimately), but in the process it takes away the motivation.

    The ones who are excited about this are the ones who are motivated by the product. When AI can whip up some half-baked solution, it sure looks like you can focus on the product and "get someone to code it up for you". But unless it's a well-understood and previously executed solution, you're going to run into actual technical problems and have to fix them, and your motivation to deal with the irritating pedantries of the modern computing stack (which are the same as all technology ever, with orders of magnitude more parts) hasn't been built up. There's no beneficial flywheel, just a fleet of the Sorcerer's Apprentice's mindless brooms that you hope you can get working well enough to ship.

> In some cases, especially with the more senior devs in my org, fear of the good parts is why they're against AI. Devs often want the inherent safety of the boring, easy stuff for a while. AI changes the job to be a constant struggle with hard problems. That isn't necessarily a good thing. If you're actually senior by virtue of time rather than skill, you can only take on a limited number of challenging things one after another before you get exhausted.

The issue of senior-juniors has always been a problem; AI simply means they're losing their hiding spots.