Comment by slibhb
12 hours ago
I think this concern is overblown. AI is an incredible teaching tool. It's probably better for teaching/explaining than for writing code. This will make the next generation of junior devs far more effective than previous generations. Not because they're skipping the fundamentals...because they have a better grasp of the fundamentals due to back-and-forth with infinitely patient AI teachers.
Not in my experience. They just regurgitate code, and juniors don’t know if/why it’s good or bad and consequently can’t field questions on their PR.
“It’s what the LLM said.” - Great. Now go learn it and do it again yourself.
Unless your company is investing in actually teaching your junior devs, this isn't really all that different from the days when jr devs just copied and pasted something out of Stack Overflow, or blindly copied entire class files around just to change one line in what could otherwise have been a shared method. And if your company is actually investing time and resources into teaching your junior devs, then whether they're copying and pasting from Stack Overflow, from another file in the project, or from AI doesn't really matter.
In my experience it is the very rare junior dev that can learn what's good or bad about a given design on their own. Either they needed to be paired with a sr dev to look at things and explain why they might not want to do something a given way, or they needed to wind up fixing the mess they made when their code broke something. AI doesn't change that.
I always say "own the output". No need to do it by hand, but you better damn well research _why_ the AI chose a solution, what alternatives there are and why it didn't pick something else, how it works, and so on. Ask the AI, ask a separate agent/model, Google for it, I don't care, but "I don't know, the LLM told me" is not acceptable.
For me, the hardest part of software development was learning incantations. These are not interesting, they're conventions that you have to learn to get stuff to work. AI makes this process easier.
If people use AI to generate code they don't understand, that will bite them. But it's an incredible tool for explaining code and teaching you boring, rote incantations.
This just means you have bad juniors who aren’t interested in learning.
It's easier to be lazy now than ever. Hard to blame them, because the temptation to deliver and prove oneself as a junior is always high.
I can't count how many seniors have forgotten what it means to understand the code they're merging since AI coding tools became popular. So long as businesses only value quantity the odds are stacked against juniors.
>AI is an incredible teaching tool.
As a junior, my top issue is finding valuable learning material that isn't full of poor or outright wrong information.
In the best and most generous interpretation of your statement, LLMs simply removed my need to search for the information. That doesn't mean the information isn't of poor quality or outright wrong.
I suspect that the quality is, ironically, correlated with the expertise of the user (i.e. it is knowledgeable if you are knowledgeable), which puts you in a conundrum. With a couple decades of experience, I can report that LLMs give me high-quality, correct results, but I can already see that it somehow doesn't work as well for some of my less experienced colleagues. A lot of what I've been doing over the last couple of months is trying to figure out how to make it "just work" for them.
As a general principle, take advantage of the fact that it can easily generate stuff. If you don't know whether something is true, have it prove it. Make a PoC/test/benchmark to demonstrate what it's saying. Have it pull metrics that you have access to. Add more observability. Create feedback loops (or rather, ask it to create feedback loops). They're very good at reasoning given access to the ground truth, so give them more ability to ground themselves.
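To make the "have it prove it" advice concrete, here is a minimal sketch of the idea, using a hypothetical example claim: suppose the AI asserts that membership tests on a `set` are much faster than on a `list` for large collections. Rather than taking its word for it, you (or the agent) can generate a quick benchmark and check:

```python
import timeit

# Hypothetical claim to verify: "set membership tests are much faster
# than list membership tests for large collections."
items = list(range(100_000))
as_list = items
as_set = set(items)

# Worst case for the list: the element we look up is at the very end,
# so the linear scan has to walk the whole list.
list_time = timeit.timeit(lambda: 99_999 in as_list, number=200)
set_time = timeit.timeit(lambda: 99_999 in as_set, number=200)

print(f"list lookup: {list_time:.4f}s, set lookup: {set_time:.4f}s")
```

The specific claim here is made up for illustration; the point is the loop itself: ask for a claim, ask for a runnable check of the claim, run it against ground truth, and only then trust the answer.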
They also have fantastic knowledge of public things, but no knowledge of your company, so your instructions should mostly be documentation of what's unique to your company. If it can write an instruction on its own (e.g. how to use git or kubernetes), it is a useless instruction; it already knows that. What it doesn't know is e.g. where your git server is. It also doesn't know what matters to your company: are you a startup trying to find product market fit? Are you an established company that is not allowed to break customer setups? etc. You might even be able to ask it what kinds of questions a senior might ask about how a company/team works when coming into a new job, and then see if you can answer those questions (or find someone who can). In fact, go ask chatgpt:
> What are some questions a senior engineer might ask when coming into a new role to make themselves more effective?
> What are some questions a principal engineer might ask when coming into a new role to make themselves more effective?
> What are some questions an engineering manager might ask when coming into a new role to make themselves more effective?
> What are some questions an engineering director might ask when coming into a new role to make themselves more effective?
Here's a tip from an old timer: read the official docs.
I work a lot with juniors, and they all seem to prefer watching videos. But videos, in my opinion, are a slow way to gain superficial knowledge.
Do it the hard way and read the official docs, it will be your superpower. Go fast over the easy parts, go slow over the hard parts, it's that simple.
Research [0] from Anthropic about juniors learning to code with AI/without:
>the AI group averaged 50% on the quiz, compared to 67% in the hand-coding group
And why would they do better? There's less incentive to learn because it's so easy to offload thinking to AI.
[0] https://www.anthropic.com/research/AI-assistance-coding-skil...
Objectively speaking, students that use AI score more than a full grade point below their peers not using AI.
AI makes students dumber, not smarter.
This is the dumbest thought that proliferates this website.
Super great that it’s used to pump out tons of code because upper management wants features released even faster than before. I’m sure the junior devs who don’t know a for loop from their ass will be able to learn and understand wtf Claude is shitting out
> AI is an incredible teaching tool. It's probably better for teaching/explaining than for writing code.
It is, but how do you teach people who think their new profession is being a "senior prompt engineer" (with 4 months of experience) and who believe that in 12 months there won't be any programmers left?
A teacher who just gives you the solution isn’t a good teacher.
You can use AI as a teacher but how many will do that?
Highly motivated people will use whatever tools they have to get better at something, whether they have a textbook, the internet or a LLM to use.
The skill of the very top programmers will continue to increase with the advent of new tools.
And how many will not? For most people it's just a job to get money; they will put exactly as much effort into it as is necessary to produce an acceptable result.
Only for people who want to be taught. This argument keeps coming up again and again, but people in general don't want to learn how to fish; they want the fish on a plate, ready to eat, so that they can continue scrolling. I see this a lot in juniors: they are solution seekers, not problem solvers, and AI makes this difference a lot worse.
I do agree it’s a great tool, so much better than trying to hope and pray someone on the internet can help you with “I don’t understand this line of code.”
However, it’s got a lot of downsides too.