Comment by dgritsko

8 hours ago

These blog posts are fascinating to read. I don't have a personal blog, but if I did I'm sure I would've written a very similar post as I've been wrestling with similar thoughts over the last few weeks. I have the distinct sense that I will look back on February 2026 as an inflection point, where AI crossed over from being an interesting parlor trick to something that fundamentally and irreversibly altered what I do day-to-day. It's bittersweet, for sure - it feels inevitable that the craft of software development that I've loved for years will be seen as an archaic relic at some point in the not too distant future. It may be several years yet before the impact is broadly felt (the full impact of today's frontier models has yet to be felt by the general public - to say nothing of models that will be released in the next few years) but this train doesn't seem to be slowing down anytime soon. This post was a helpful reminder that who I am is not defined by the code I write (or don't write) - there's so much more to life than code.

One part of me wants to resist and tell you that our craft is not becoming an archaic relic; the other part already knows you're right. We can't put the genie back in the bottle, and now's a good time to recalibrate your passion.

  • I look at it like this: Yes, AI can write code. It can write it much faster than I can. Sometimes it can also write it better than I can.

    But: programming languages, libraries, and abstractions are not going away. It is still possible (and might always be possible) to get deep into the weeds of Python or Rust or whatever to understand how those work and really harness them to their full potential, or develop them further. It just won't be _compulsory_ (in most industries) if your only goal is to trade lines of code for dollars in your bank account.

    • I mostly share your perspective, but I don't know if I would share your emphasis.

      Lines of code for dollars used to be a trade businesses made with developers out of necessity, but soon it will only be economically viable to make that trade with AI providers. So not only will going deep into the weeds not be compulsory; understanding anything about any programming concept will become economically void (though not void of educational value or enjoyment).

      On the other hand, what that code does depends entirely on a particular understanding of the real world, which is indescribably complex (i.e. combinatorially explosive). This is what I truly care about, and the possibilities for the application and customization of software are infinite. The interface between the world and software will always involve a value decision that AI cannot have a monopoly over (it would be economically infeasible, no matter how cheap inference becomes). This means that as long as my passion is not within the machine, but is instead centered on the relationship between the machine and the world, I will never be out of a job.

      And part of me thinks, "good riddance!" For all the good we created, developers have also generated so much bullshit that it's honestly insane any software companies were ever successful in spite of it. The human politicking is probably the worst of it - think of the countless years of human life wasted in scrum ceremonies - but also so much of the software we've created sucks, and users hate it!

      We used to be a proud culture of hackers, building miracles with minuscule resources, or at least that's what the greybeards here on HN like to whine about. They're right: we've squandered limitless cycles and uncountable exabytes of useless data. If there were a God of hackerdom, we would be living in his Gomorrah, and he would strike us down with AI as punishment for these sins.

> (the full impact of today's frontier models has yet to be felt by the general public - to say nothing of models that will be released in the next few years)

We definitely saw some kind of step-function jump in quality around the beginning of the year - it's hard to express how good Claude opus/sonnet 4.6 is now. However, I wonder if we're going to see the same kind of improvement from here. It feels like we got to the 80% point, but the next 20% is going to be a lot harder and take longer than that first 80% (Pareto principle). Also, as more and more code out there is AI generated, it's going to be like the snake eating its own tail: training models on AI-generated code doesn't seem like it will lead to improvements.