Comment by reenorap

14 hours ago

The cope-ism in this blog post is palpable. The author is genuinely offended that someone who doesn't know how to code is daring to invade his turf. It's pretty sad that this is how he is reacting.

I, for one, welcome the new paradigm shift of vibe coders entering the field. I still think I have a competitive advantage with my 30+ years of coding experience, but I don't think it's wrong for vibe coders to enter my turf. I think the value of code is rapidly trending asymptotically toward ZERO. Code has no value anymore. It doesn't matter if it's slop as long as it works. If you are one of the ones who believes that all code written by humans is sacred and infallible, you probably don't have a lot of experience working in many companies. Most human code is garbage anyway. If it's AI-generated, at least it's based on better best practices, and if it's really bad you just need to reprompt it or wait for a newer version of the AI and it will automatically get better.

THIS IS THE NEW PARADIGM. THINKING YOU HAVE ANY POWER TO SWAY THE FUTURE AWAY FROM THIS PATH IS FOOLISH.

I'm currently running a migration program at work, and it turns out there's a 10 MB limit on how much data I can batch over at one time. At first I asked AI to copy 10 rows per batch, but that was too slow. Then I asked it to change the code to do 400 rows per batch, but sometimes it failed because it exceeded the 10 MB limit. Then I said just collect rows until you get to 10 MB and then send it off. This is working perfectly and I'm now running it without any hitches so far. Then I asked it to add an estimate, after every batch, of how long it would take to finish, including the end time.
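The accumulate-until-10-MB approach described above can be sketched roughly like this (a minimal sketch, not the commenter's actual code: `send_batch`, the JSON row format, and the size accounting are all assumptions for illustration):

```python
import json
import time

MAX_BATCH_BYTES = 10 * 1024 * 1024  # the 10 MB per-batch limit


def send_batch(rows):
    """Placeholder for the real upload/copy call."""
    pass


def migrate(rows, total_rows, send=send_batch):
    """Greedily pack rows into batches that stay under MAX_BATCH_BYTES,
    flush each full batch, and print a rough ETA after each flush."""
    start = time.monotonic()
    batch, batch_bytes, done = [], 0, 0
    for row in rows:
        row_bytes = len(json.dumps(row).encode("utf-8"))
        # Flush before adding this row would push us over the limit.
        if batch and batch_bytes + row_bytes > MAX_BATCH_BYTES:
            send(batch)
            done += len(batch)
            elapsed = time.monotonic() - start
            remaining = elapsed / done * (total_rows - done)
            print(f"{done}/{total_rows} rows sent, ~{remaining:.0f}s remaining")
            batch, batch_bytes = [], 0
        batch.append(row)
        batch_bytes += row_bytes
    if batch:  # send whatever is left over
        send(batch)
        done += len(batch)
    return done
```

The greedy packing means batch sizes adapt automatically to how wide the rows are, which is why it beats any fixed rows-per-batch number.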

I really love this new world we're living in with AI coding. Sure this could have been done by someone without experience, but at least for right now the ideas I can come up with are much better than those without any experience, and that's hopefully the edge that keeps me employed. But whatever the new normal is, I'm ready to adapt.

Would you have some random bloke with ChatGPT work on and test the electrical wiring in your house? Or would you prefer someone who actually understands what they are doing? What about software that calculates expensive material requirements and cutting? What about medical software that can make decisions about your health?

It just sounds like you work on very low-stakes software, probably CRUD apps if I had to guess. But software can be a lot more than that. If written competently, it can make decisions and do calculations that have real consequences.

i too find lots of value in llms, but your example describes a scenario a programmer could have also easily solved, and maybe even gotten right on the first or second shot.

that isn't to say an llm can't be useful but your post implies it's inevitable that llms will replace humans entirely from writing code, which i think is incredibly optimistic at best.

that said we will see!

nothing foolish about trying, even if he too thinks it's inevitable. it's foolish, however, to think that there won't be nuances to such a future (and that somehow no one can influence those nuances).

> It doesn't matter if it's slop as long as it works

I agree with most of what you said, but that statement doesn't take the time dimension into account. Slop accumulates, and eventually becomes unmanageable. We need to teach AI to become lean engineers too.

  • I have only seen AI make codebases better, and I'm talking about it making some pretty nuanced changes. I think mass-rewriting of projects is possible these days with AI.

    • I disagree on both fronts. Unguided AI can be a very efficient tech debt generator.

    • just last week, AI led a developer on our team to brick our git history when he was attempting to fix a deploy. he's not a git expert, but an llm shouldn't have led him that far astray, no?

      i see cases on a weekly basis where, if an llm had been left to follow its initial direction without human oversight, it would have broken otherwise working programs