Comment by ein0p
1 year ago
I'm a principal SWE with 25 years of experience, and I think software today is comically bad and way too hard to use. So I think we can get engineers to write better software with these tools. The talk of "replacement" is going to be premature until we get something remotely resembling AGI. Unless your problems are so simple that a monkey could solve them, the AI of today and the foreseeable future is not going to solve them end to end. At best it'll fill in the easy parts, which you probably don't want to do anyway. Write a test. Simple refactor. Bang out some simple script to pay down some engineering debt. I've yet to see a system that doesn't crap out right at the beginning on the real problems I solve on a daily basis. I'm by no means a naysayer - I work in this field and use AI many times daily.
Funny enough, now I write better code than I used to thanks to AI because of two reasons:
- AI naturally writes code that is more organized and clean (proper abstractions, no messy code)
- I've recognized that, for AI to write code in an existing codebase, the code has to be clean, organized, and make sense, so I tend to do more refactoring to make sure AI can take the code over and update it when needed.
> Funny enough, now I write better code than I used to thanks to AI because of two reasons:
I assume you also believe you'll be one of the developers AI doesn't replace.
I'm actively transitioning out of a "software engineer" role to be more open-minded about how to coexist with AI while still contributing value.
Prompt engineering, organizing code for AI agents to be more effective, guiding non-technical people to understand how to leverage AI, etc. I'm also building products myself and selling them myself.
Today an AI told me that a non-behavioral change in my codebase was going to give us a 10x improvement on our benchmarks.
Frankly, if you were writing code that is more poorly structured than what GPT or whatever generates today, then you are just a mediocre developer.
See, the thing is, to determine which abstractions are "right and proper" you _already need a software engineer_ who knows those kinds of things. Moreover, that engineer needs the ability to read that code, understand it, and plan its evolution over time. He/she also needs to be able to fix the bugs, because there will be bugs.
I think your main thesis is that "AI of today and foreseeable future is not going to solve them end to end."
My belief is that we can't solve them today (agree with you), but we can solve them in the foreseeable future (within 3 years).
So it is really a matter of different beliefs. And I don't think we will be able to convince each other to switch beliefs.
Let's just watch what happens?
> 'proper abstraction'
I assume you're not talking about chatgpt4o, because in my experience it's absolutely dogshit at abstracting code meaningfully. Good at finding design patterns, sure, but if your AI doesn't understand how to write a state machine, I'm not sure how I'm supposed to use it.
It's great at writing tests and documentation though.
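To make the state-machine complaint concrete, here is a minimal sketch (all names hypothetical, not from the thread) of the kind of explicit state machine the commenter expects a model to produce: states as an enum and legal transitions in one table, so illegal moves fail loudly instead of being scattered through if/else branches.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    CONNECTING = auto()
    CONNECTED = auto()
    CLOSED = auto()

# Allowed transitions: (event, current state) -> next state.
# Anything not listed here is an illegal transition.
TRANSITIONS = {
    ("connect", State.IDLE): State.CONNECTING,
    ("established", State.CONNECTING): State.CONNECTED,
    ("close", State.CONNECTING): State.CLOSED,
    ("close", State.CONNECTED): State.CLOSED,
}

class Connection:
    def __init__(self) -> None:
        self.state = State.IDLE

    def handle(self, event: str) -> State:
        key = (event, self.state)
        if key not in TRANSITIONS:
            raise ValueError(f"illegal event {event!r} in state {self.state.name}")
        self.state = TRANSITIONS[key]
        return self.state

conn = Connection()
conn.handle("connect")
conn.handle("established")
assert conn.state is State.CONNECTED
```

The point of the table-driven form is that the whole protocol is auditable in one place, which is exactly the kind of structure the comment says current models fail to find on their own.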
GPT-4o is at least an order of magnitude behind Claude 3.5 Sonnet at coding. I use the latter.