Comment by hiAndrewQuinn
6 days ago
I'll take the opposite view of most people. Expertise is a bad thing. We should embrace technological changes that render expertise economically irrelevant with open arms.
Take a domain like US taxation. You can certainly become an expert in that, and many people do. Is it a good thing that US taxes are so complicated that we have a market demand for thousands of such experts? Most people would say no.
Don't get me wrong, I've been coding for more years of my life than I haven't at this point, and I love the craft. I still think younger me would have far preferred a world where he could have just had GPT do it all for him, so he didn't need to spend his lunch hours poring over the finer points of, e.g., Python iterators.
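(For concreteness, one such "finer point" might be something like this - a sketch of a classic Python iterator gotcha, not anything specific the comment names: generators are single-pass, so exhausting one silently yields nothing on a second pass.)

```python
# Generators implement the iterator protocol and are single-pass:
# once exhausted, iterating again produces no items and no error.
squares = (n * n for n in range(5))

first_pass = list(squares)   # consumes the generator
second_pass = list(squares)  # already exhausted, silently empty

print(first_pass)   # [0, 1, 4, 9, 16]
print(second_pass)  # []
```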
By the same logic we should allow anyone with an LLM to design ships, bridges, and airliners.
Clearly, it would be very unwise to buy a bridge designed by an LLM.
It's part of a more general problem - the engineering expectations for software development are much lower than for other professions. If your AAA game crashes, people get annoyed but no one dies. If your air traffic control system fails, you - and a large number of other people - are going to have a bad day.
The industry has a kind of glib unseriousness about engineering quality - not theoretical quality, based on rules of thumb like DRY or faddish practices, but measurable reliability metrics.
The concept of reliability metrics doesn't even figure in the LLM conversation.
That's a very bizarre place to be.
> We should embrace technological changes that render expertise economically irrelevant with open arms.
To use your example, is using AI to file your taxes actually "rendering [tax] expertise economically irrelevant?" Or is it just papering over the over-complicated tax system?
From the perspective of someone with access to the AI tool, you've somewhat eased the burden. But you haven't actually solved the underlying problem (with the actual solution obviously being a simpler tax code). You have, on the other hand, added an extra dependency on top of an already over-complicated system.
In addition, a substantial portion of the complexity in software is essential complexity, not just accidental complexity that could be done away with.
This. And most of the time the code isn't that complex either. The complexity of a software product often isn't in the code, it's in the solution as a whole, the why's of each decision, not the how.
However, whenever I've faced actually hard tasks - things that require going off the beaten path the AI trains on - I've found it severely lacking. No matter how much or how little context I give it, and no matter how many new chats I start, it just won't veer into truly new territory.
I never said anything about using AI to do your taxes.
I was drawing an analogy. We would probably be better off with a tax system that wasn't so complicated that it creates its own specialized workforce. Similarly, we would be better off with programming tools that make the task so simple that professional computer programmers feel like a 20th-century anachronism. It might not be what we personally want as people who work in the field, but it's for the best.
> I never said anything about using AI to do your taxes. I was drawing an analogy.
Yeah, I was using your analogy.
> It might not be what we personally want as people who work in the field, but it's for the best.
You're inventing a narrative and borderline making a strawman argument. I said nothing about what people who work in the field "personally want." I'm talking about complexity.
> Similarly we would be better off with programming tools that make the task so simple that professional computer programmers feel like a 20th century anachronism.
My point is that if the "tools that make the task simple" don't actually simplify what's happening in the background, but rather paper over it with additional complexity, then no, we would not "be better off" with that situation. An individual with access to an AI tool might feel that he's better off; anyone without access to those tools (now or in the future) would be screwed, and the underlying complexity may still create other (possibly unforeseen) problems as that ecosystem grows.
The question then becomes whether it's possible (or will be possible) to effectively use these LLMs for coding without already being an expert. Right now, building anything remotely complicated with an LLM, without poring over every line of generated code, is not possible.
Counter-counterpoint: the existence of tools like this can allow the tax code to become even more complex.
Nowhere do I suggest using AI to do your taxes. My point was: if you think it's bad that taxes are complicated enough that many people need to hire a professional to do them, you should also think it's bad that programming is complicated enough that many people need to hire a professional to do it.
I mean, we already have vibe tariffs, so vibe taxation isn’t far off. ;)
Don't think of it from the perspective of someone who had to learn. Think of it from the perspective of someone who has never had to experience the friction of learning at all.
But that is incompatible with the fact that you need to be an expert to wield this tool effectively.