Comment by tedious-coder
6 days ago
AI makes me sad. When I started my CS degree, I didn't even know what Silicon Valley was. I was unaware of what the SWE job landscape was like. I went to school in a no-name town.
Computer science was an immensely fun subject to learn. I moved to one of the big cities and was bewildered by how much there was to learn, and loved every second of it. I gradually became good enough to help anyone with almost anything, and spent lots of my free time digging deeper and learning.
I liked CS and programming - but I did not like products built by the companies where I was good enough to be employed. These were just unfortunate annoyances that allowed me to work close enough to what I actually enjoyed, which was just code, and the computer.
Before LLMs, those like me could find a place within most companies - the person you don't go to for fast features, but for weird bugs or other things that the more product-minded people weren't interested in. There was still, however, an uncomfortable tension. And now that tension is even greater. I do not use an LLM to write all my code, because I enjoy doing things myself. If I do not have that joy, then it will be immensely difficult for me to continue in the career I have already invested so much time in. If I could go back in time and choose another field, I would - but since that's not possible, I don't understand why it's so hard for people to have empathy for people like me. I would never have gone down this path if I had known that one day my hard-earned knowledge would become so much less valuable, and that I'd be forced to delegate the only part of the job I enjoyed to the computer itself.
So Thomas, maybe your AI skeptic friends aren't nuts, they just have different priorities. I realize that my priorities are at odds with those of the companies I work for. I am just tightly gripping the last days that I can get by doing this job the way that I enjoy doing it.
I recommend reframing this.
LLMs don't make your hard-earned knowledge less valuable: they make it more valuable.
You are better qualified to use them to build great software than people who don't have your level of experience and software engineering domain expertise.
If you don't want to do that, then I guess you can find another career - but if you switch careers because you incorrectly think that LLMs make programming experience less valuable, you would be making a big mistake, in my opinion.
I agree with your assessment of the value of the skills, at least for the time being. What I dislike is the way that we are being encouraged to work now. I simply do not find any joy, at all, in reviewing LLM-written code and then typing in the next prompt.
A sentiment I see often is that it's work, it's not supposed to be fun, and you work at the pleasure of the employer. And I accept that. But I still am really just crushingly sad that this is what my job is becoming.
In the article, Thomas wrote:
> LLMs can write a large fraction of all the tedious code you’ll ever need to write. And most code on most projects is tedious. LLMs drastically reduce the number of things you’ll ever need to Google. They look things up themselves. Most importantly, they don’t get tired; they’re immune to inertia.
I see this as a massive downside, because I loved writing tedious code. I loved reading docs on something I previously didn't understand. I loved forming mental models strong enough to say "yeah, I see why that's there" about the previously-inscrutable APIs of the frameworks and such that I was using. It was precisely the _way_ that I approached that work that allowed that knowledge to accrue. It was because I almost never copy/pasted something without spending a lot of time understanding it.
I do some of the same with ChatGPT. I type the code in myself after trying to internalize the ChatGPT response. But even that is starting to feel like company time-theft, as the attitude is shifting even further away from "knowing how to do things is good" toward "getting shit done is all that matters."
> You are better qualified to use them to build great software than people who don't have your level of experience and software engineering domain expertise
Since a big majority of companies have stopped hiring juniors, where is the new blood coming from when, inevitably, the seniors retire?
I think the important thing here is that you're being honest about how you're feeling. You bring up a very real anxiety and possibility, and even folks who are using LLMs probably feel some degree of this alienation: that LLMs are yet another tool to push us to move as fast as possible, rather than letting our brains get into the nooks and crannies of hard problems that may take longer but are more rewarding to us.
But again, you're being honest. The problem with a lot of the AI skeptic arguments I see is a lack of this honesty. Others have noted that there are a lot of contradictory skeptical arguments, and I suspect the contradictions arise because the authors have negative emotions about AI, which they then work backward into negative arguments.
I do fall into this category of people who are seen as heavily abusing copium. I can admit that when I do get unsatisfactory results from a prompt session, a lot of it has to do with the mental friction I feel at the idea of letting something else write my code.
It again comes back to my opinion that LLMs have recreated the job in a way that emphasizes what I disliked most and de-emphasizes what I liked. It emphasizes "the goal" and de-emphasizes "the process". We had a period in the '10s where the process (namely, becoming adept at using and learning an ever-changing set of open source tools) was a bit more celebrated. You could justify a lunch-and-learn on things like man pages, transaction isolation levels, or a package manager - and doing something like that would be seen in a positive light. And now, why would you waste everyone's time talking about something that ChatGPT can figure out for you?
Anyway, thanks for your time in your response.