Comment by alganet
13 hours ago
Think of it this way: if your problem can be solved by an LLM with the same quality, then it's not a problem worthy of a human to tackle. It probably never was in the first place; we just didn't know.
The only exception here is learning (solving a solved problem so you can internalize it).
There are tons of problems that LLMs can't tackle. I chose two of those: polyglot programs (I already worked on them before AI) and bootstrapping from source (AI can't even understand what the problem is). The progress I can get in those areas is not improved by using LLMs, and it feels good. I am sure there are many more such problems out there.
I actually agree with everything you said, and I see I failed to communicate my idea. That's exactly why I'm so upset.
You said "the only exception here is learning" - and that exception was my hobby. Programming simple things wasn't work for me. It was entertainment. It was what I did for fun on weekends.
Reading a blog post about writing a toy database or a parser combinator library and then spending a Saturday afternoon implementing it myself: that was like going to an amusement park. It was a few hours of enjoyable, bounded exploration. I could follow my curiosity, learn something new, and have fun doing it.
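(To give a sense of the scale of project I mean: a minimal parser combinator fits in a few dozen lines. This is my own sketch, not from any particular blog post. A parser here is just a function taking `(text, pos)` and returning `(value, new_pos)`, or `None` on failure.)

```python
# Minimal parser-combinator sketch: the Saturday-afternoon-sized project.
# A parser is a function (s, i) -> (value, next_index) or None on failure.

def char(c):
    """Match a single literal character."""
    def parse(s, i):
        if i < len(s) and s[i] == c:
            return c, i + 1
        return None
    return parse

def seq(*parsers):
    """Run parsers in order; succeed only if all succeed."""
    def parse(s, i):
        values = []
        for p in parsers:
            r = p(s, i)
            if r is None:
                return None
            v, i = r
            values.append(v)
        return values, i
    return parse

def many(p):
    """Apply p zero or more times, collecting results."""
    def parse(s, i):
        values = []
        while True:
            r = p(s, i)
            if r is None:
                return values, i
            v, i = r
            values.append(v)
    return parse

# Example: a run of 'a's followed by a single 'b'.
ab = seq(many(char('a')), char('b'))
print(ab("aaab", 0))  # ([['a', 'a', 'a'], 'b'], 4)
print(ab("ccc", 0))   # None
```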
And you're right: if an LLM can solve it with the same quality, it's not a problem worthy of human effort. I agree with that logic. I've internalized it from years in the industry, from working with AI, from learning to make decisions about what to spend time on.
But here's what's been lost: that logic has closed the amusement park. All those simple, fun learning projects now feel stupid. When I see those blog posts now, my gut reaction is "why would I waste time on that? That's one prompt away." The feeling that it's "not worthy" has completely drained the joy out of it.
I can't turn off that instinct anymore. I know those 200 lines of code are trivial. I know AI can generate them. And so doing it myself feels like I'm deliberately choosing to be inefficient, like I'm LARPing at being a programmer instead of actually learning something valuable.
The problem isn't that I disagree with you. The problem is that I agree with you so completely that I can no longer have fun. The only "worthy" problems left are the hard ones AI can't do. But those require months of serious investment, not a casual Saturday afternoon.
> The feeling that it's "not worthy" has completely drained the joy out of it.
It was never "worthy". With the proliferation of free, quality, open source software, what's now a prompt away has been a GitHub repo away for a long time. It's just that, before, you chose to ignore the existence of GitHub repos and enjoy your hobby. Now you're choosing not to ignore the AI.
> And so doing it myself feels like I'm deliberately choosing to be inefficient
People have plenty of hobbies that are not the most "efficient" way to solve a problem. There are planes, but some people ride bikes across continents. Some walk.
LLMs exist, you can choose to what level you use them. Maybe you need to detox for a weekend or two.
I genuinely do not understand this. You can totally still do that for learning purposes.
The only thing you cannot do anymore is show off such projects. The portfolio of mini-tutorials is definitely a bygone concept. I actually like that part of how the culture has changed.
Another interesting challenge is to set yourself up to outperform the LLM. Golf with it. The LLM can do a parser? Okay, I'll make a faster one instead. Fewer lines of code. There are tons of learning opportunities in that.
> The only "worthy" problems left are the hard ones
That's not true. There are also unexplored problems for which the AI doesn't have enough training data to be useful.