Comment by recursivedoubts
9 hours ago
AI is incredibly dangerous because it can do the simple things very well, which prevents new programmers from learning the simple things ("Oh, I'll just have AI generate it") which then prevents them from learning the middlin' and harder and meta things at a visceral level.
I'm a CS teacher, so this is where I see a huge danger right now and I'm explicit with my students about it: you HAVE to write the code. You CAN'T let the machines write the code. Yes, they can write the code: you are a student, the code isn't hard yet. But you HAVE to write the code.
It’s like weightlifting: sure you can use a forklift to do it, but if the goal is to build up your own strength, using the forklift isn’t going to get you there.
This is the ultimate problem with AI in academia. We all inherently know that “no pain no gain” is true for physical tasks, but the same is true for learning. Struggling through the new concepts is essentially the point of it, not just the end result.
Of course this becomes a different thing outside of learning, where delivering results is more important in a workplace context. But even then you still need someone who does the high level thinking.
I think this is a pretty solid analogy, but I look at the metaphor this way: people used to get strong naturally because they had to do physical labor. Because we invented things like the forklift, we had to invent things like weightlifting to get strong instead. You can still get strong; you just need to be more deliberate about it. That doesn't mean you shouldn't also use a forklift, which is its own distinct skill you also need to learn.
It's not a perfect analogy though because in this case it's more like automated driving - you should still learn to drive because the autodriver isn't perfect and you need to be ready to take the wheel, but that means deliberate, separate practice at learning to drive.
> people used to get strong naturally because they had to do physical labor
I think that's a bit of a myth. The Greeks and Romans had weightlifting and boxing gyms, but no forklifts. Many of the most renowned competitors in the original form of the Olympics and in boxing were aristocrats with the wealth and free time to lift weights, box, and wrestle. One of the things we know about the famous philosopher Plato is that "Plato" was essentially a nickname (meaning "broad") from his first career as a wrestler, somewhat like Dwayne "The Rock" Johnson. That adds a fun twist to reading the Socratic Dialogues or thinking about relationships as "platonic".
Arguably the "meritocratic ideal" of the gladiator arena was that even "blue collar" Romans could compete and maybe survive. But even in the stories that survive, few did.
There may be a lesson in that myth, too, that the people that succeed in some sports often aren't the people doing physical labor because they must do physical labor (for a job), they are the ones intentionally practicing it in the ways to do well in sports.
Weightlifting and weight training was invented long before forklifts. Even levers were not properly understood back then.
My favorite historic example of typical modern hypertrophy-specific training is the training of Milo of Croton [1]. By legend, his father gave him a calf and asked him to carry it to him daily, which Milo did. As the calf's weight grew, so did Milo's strength.
This is the application of the principles of external resistance (the calf) and progressive overload (the growing calf) at work.
[1] https://en.wikipedia.org/wiki/Milo_of_Croton
Milo lived before Archimedes.
> if the goal is to build up your own strength
I think you missed this line. If the goal is just to move weights or lift the most, forklift away. If you want to learn to use a forklift, drive on and best of luck. But if you're trying to get stronger, the forklift will not help that goal.
Like many educational tests, the outcome is not the point; doing the work to get there is. If you're asked to code FizzBuzz, it's not because the teacher needs you to solve FizzBuzz for them; it's because you will learn things while you make it. AI, copying Stack Overflow, using someone's code from last year: it all solves the problem while missing the purpose of the exercise. You're not learning, and presumably that is your goal.
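For anyone who hasn't met the exercise, the whole of FizzBuzz fits in a dozen lines. Here's a sketch in Python (the language choice is mine; the exercise doesn't specify one). The point stands: the value is in writing it yourself, not in the output.

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings:
    multiples of 3 become "Fizz", of 5 "Buzz", of both "FizzBuzz"."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print("\n".join(fizzbuzz(15)))
```

Trivial, yes, but that's the point: the branching order (check 15 before 3 and 5) is exactly the kind of small detail you only internalize by writing it, not by pasting it.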
Thanks for the analogy. But I think students may think to themselves: "Why do I need to be stronger if I can use a forklift?"
I like this analogy along with the idea that "it's not an autonomous robot, it's a mech suit."
Here's the thing -- I don't care about "getting stronger." I want to make things, and now I can make bigger things WAY faster because I have a mech suit.
edit: and to stretch the analogy, I don't believe much is lost "intellectually" by my use of a mech suit, as long as I observe carefully. Me doing things by hand is probably overrated.
The point of going to school is to learn all the details of what goes into making things, so when you actually make a thing, you understand how it’s supposed to come together, including important details like correct design that can support the goal, etc. That’s the “getting stronger” part that you can’t skip if you expect to be successful. Only after you’ve done the work and understand the details can you be successful using the power tools to make things.
> Everyone knows that debugging is twice as hard as writing a program in the first place. So if you're as clever as you can be when you write it, how will you ever debug it? — The Elements of Programming Style, 2nd edition, chapter 2
If you weren't even "clever enough" to write the program yourself (or, more precisely, if you never cultivated a sufficiently deep knowledge of the tools & domain you were working with), how do you expect to fix it when things go wrong? Chatbots can do a lot, but they're ultimately just bots, and they get stuck & give up in ways that professionals cannot afford to. You do still need to develop domain knowledge and "get stronger" to keep pace with your product.
Big codebases decay and become difficult to work with very easily. In the hands-off vibe-coded projects I've seen, that rate of decay was extremely accelerated. I think it will prove easy for people to get over their skis with coding agents in the long run.
OK, it’s a mech suit. The question under discussion is, do you need to learn to walk first, before you climb into it? My life experience has shown me you can’t learn things by “observing”, only by doing.
If all I know is the mech suit, I’ll struggle with tasks that I can’t use it for. Maybe even get stuck completely. Now it’s a skill issue because I never got my 10k hours in and I don’t even know what to observe or how to explain the outcome I want.
In true HN fashion of trading analogies, it’s like starting out full powered in a game and then having it all taken away after the tutorial. You get full powered again at the end but not after being challenged along the way.
This makes the mech suit attractive to newcomers and non-programmers, but only because they see product in massively simplified terms. Because they don’t know what they don’t know.
This analogy works pretty well. Too much time doing everything in it and your muscles will atrophy. Some edge cases will be better if you jump out and use your hands.
The mech suit works well until you need to maintain stateful systems. I've found that while initial output is faster, the AI tends to introduce subtle concurrency bugs between Redis and Postgres that are a nightmare to debug later. You get the speed up front but end up paying for it with a fragile architecture.
> "it's not an autonomous robot, it's a mech suit."
Or "An [electric] bicycle for the mind." Steve Jobs/simonw
No, it's not a mech suit. A mech suit doesn't fire its canister rifle at friendly units and then say "You're absolutely right! I should have done an IFF before attacking that unit." (And if it did the engineer responsible should be drawn and quartered.) Mech-suit programming AI would look like something that reads your brainwaves and transduces them into text, letting you think your code into the machine. I'd totally use that if I had it.
> I want to make things
You need to be strong to do so. Things of any quality or value at least.
Misusing a forklift might injure the driver and a few others; but it is unlikely to bring down an entire electric grid, expose millions to fraud and theft, put innocent people in prison, or jeopardize the institutions of government.
There is more than one kind of leverage at play here.
> Misusing a forklift might injure the driver and a few others; but it is unlikely to bring down an entire electric grid
That's the job of the backhoe.
(this is a joke about how diggers have caused quite a lot of local internet outages by hitting cables, sometimes supposedly "redundant" cables that were routed in the same conduit. Hitting power infrastructure is rare but does happen)
> but it is unlikely to bring down an entire electric grid
Unless you happen to drive a forklift in a power plant.
> expose millions to fraud and theft
You can if you drive a forklift in a bank.
> put innocent people in prison
You can use a forklift to put several innocent people in prison in one trip; they have pretty high capacity.
> jeopardize the institutions of government.
It's pretty easy with a forklift, just try driving through main gate.
> There is more than one kind of leverage at play here.
Forklifts typically have several axes of travel.
I do appreciate the visual of driving a forklift into the gym.
The activity would train something, but it sure wouldn't be your ability to lift.
A version of this does happen with regard to fitness.
There are enthusiasts who will spend an absolute fortune to get a bike that is a few grams lighter and then use it to ride up hills for the exercise.
Presumably a much cheaper bike would mean you could use a smaller hill for the same effect.
I feel like the aviation pilot angst captured by "automation dependency" and the fears around skills loss is another great analogy. [0]
[0] https://eazypilot.com/blog/automation-dependency-blessing-or...
How seriously do you mean the analogy?
I think forklifts probably carry more weight over longer distances than people do (though I could be wrong, 8 billion humans carrying small weights might add up).
Certainly forklifts have more weight * distance when you restrict to objects that are over 100 pounds, and that seems like a good decision.
I think it's a good analogy. A forklift is a useful tool and objectively better than humans for some tasks, but if you've never developed your muscles because you use the forklift every time you go to the gym, then when you need to carry a couch up the stairs you'll find that you can't do it and the forklift can't either.
So the idea is that you should learn to do things by hand first, and then use the powerful tools once you're knowledgeable enough to know when they make sense. If you start out with the powerful tools, then you'll never learn enough to take over when they fail.
I feel we are on the cusp of a new era... Civil Engineering bridge analogies about to be replaced by forklift analogies.
You're making the analogy work: the point of weightlifting as a sport or exercise is not to actually move the weights, but to condition your body so that it can move the weights.
Indeed, usually after weightlifting you return the weights to the place you originally took them from, so I suppose that means you did no work at all in the first place...
The real challenge will be that people almost always pick the easier path.
We have a decent sized piece of land and raise some animals. People think we're crazy for not having a tractor, but at the end of the day I would rather do it the hard way and stay in shape while also keeping a bit of a cap on how much I can change or tear up around here.
I've been showing my students this video of a robot lifting weights to illustrate why they shouldn't use AI to do their homework. It's obvious to them the robot lifting weights won't make them stronger.
https://www.youtube.com/watch?v=Be7WBGMo3Iw
I like the weightlifting parable!
Unlike weightlifting, the main goal of our jobs is not to lift heavy things, but develop a product that adds value to its users.
Unfortunately, many devs don't understand that.
Yes but the goal of school is to lift heavy things, basically. You're trying to do things that are difficult (for you) but don't produce anything useful for anyone else. That's how you gain the ability to do useful things.
I had my first interview last week where I finally saw this in the wild. It was a student applying for an internship. It was the strangest interview. They had excellent textbook knowledge. They could tell you the space and time complexities of any data structure, but they couldn't explain anything about code they'd written or how it worked. After many painful and confusing minutes of trying to get them to explain, like, literally anything about how this thing on their resume worked, they finally shrugged and said that "GenAI did most of it."
It was a bizarre disconnect having someone be both highly educated and yet crippled by not doing.
Sounds a little bit like the stories from Feynman, e.g.: https://enlightenedidiot.net/random/feynman-on-brazilian-edu...
The students had memorized everything, but understood nothing. Add in access to generative AI, and you have the situation that you had with your interview.
It's a good reminder that what we really do, as programmers or software engineers or what you wanna call it, is understanding how computers and computations work.
Lots of theory but no practice.
More like using a calculator but not being able to explain how to do the calculation by hand. A probabilistic calculator which is sometimes wrong at that. The "lots of theory but no practice" has always been true for a majority of graduates in my experience.
This is the kind of interaction that makes me think that there are only 2 possible futures:
Star Trek or Idiocracy.
Hmmm, I think we're more likely to face an Idiocracy outcome. We need more Geordi La Forges out there, but we've got a lot of Fritos out here vibe coding the next Carl's Jr. locating app instead
Star Trek illustrated the issue nicely in the scene where Scotty, who we should remember is an engineer, tries to talk to a computer mouse in the 20th century: https://www.youtube.com/watch?v=hShY6xZWVGE
This is exactly the end state of hiring via Leetcode.
What you as a teacher teach might have to adapt a bit. Teaching how code works is more important than teaching how to code. Most academic computer scientists aren't necessarily very skilled as programmers in any case. At least, I learned most of that after I stopped being an academic myself (Ph.D. and all). This is OK. Learning to program is more of a side effect of studying computer science than it is a core goal (this is not always clearly understood).
A good analogy here is programming in assembler. Manually crafting programs at the machine-code level was very common when I got my first computer in the 1980s, especially for games. By the late 90s that had mostly disappeared. RollerCoaster Tycoon was one of the last huge commercial successes coded that way. C/C++ took over, and these days most game studios license an engine and then do a lot of work in languages like C# or Lua.
I never did any meaningful amount of assembler programming. It was mostly no longer a relevant skill by the time I studied computer science (94-99). I built an interpreter for an imaginary CPU at some point using a functional programming language in my second year. Our compiler course was taught by people like Erik Meijer (who later worked on things like F# at MS), who just saw it as a great excuse to teach people functional programming instead. In hindsight, that was actually a good skill to have as interest in functional programming heated up a lot about 10 years later.
The point of this analogy: compilers are important tools. It's more important to understand how they work than it is to be able to build one in assembler. You'll probably never do that. Most people never work on compilers. Nor do they build their own operating systems, databases, etc. But it helps to understand how they work. The point of teaching how compilers work is understanding how programming languages are created and what their limitations are.
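To give a feel for what "understanding how compilers work" means at the smallest scale, here's a toy recursive-descent evaluator for `+`/`*` arithmetic, sketched in Python (my choice of language and example, not anything from the thread). It shows the front-end pipeline in miniature: tokenize, parse by grammar rules, evaluate, with operator precedence falling out of the grammar itself.

```python
import re

def tokenize(src):
    # Split the source into numbers and the operators/parens we know about.
    return re.findall(r"\d+|[+*()]", src)

def parse_expr(tokens):          # expr := term ('+' term)*
    value = parse_term(tokens)
    while tokens and tokens[0] == "+":
        tokens.pop(0)            # consume '+'
        value += parse_term(tokens)
    return value

def parse_term(tokens):          # term := factor ('*' factor)*
    value = parse_factor(tokens)
    while tokens and tokens[0] == "*":
        tokens.pop(0)            # consume '*'
        value *= parse_factor(tokens)
    return value

def parse_factor(tokens):        # factor := number | '(' expr ')'
    tok = tokens.pop(0)
    if tok == "(":
        value = parse_expr(tokens)
        tokens.pop(0)            # consume ')'
        return value
    return int(tok)

def evaluate(src):
    return parse_expr(tokenize(src))

print(evaluate("2+3*4"))   # '*' binds tighter because term sits below expr
```

Nobody needs to ship this; the point is that an afternoon writing it teaches more about why languages have the limitations they do than a semester of reading about grammars.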
> Teaching how code works is more important than teaching how to code.
People learn by doing. There's a reason that "do the textbook problems" is somewhat of a meme in the math and science fields - because that's the way that you learn those things.
I've met someone who said that when he gets a textbook, he starts by only doing the problems, skipping the chapter content entirely. Only when he has significant trouble with the problems (i.e. he's stuck on a single one for several hours) does he read the chapter text.
He's one of the smartest people I know.
This is because you learn by doing the problems. In the software field, that means coding.
Telling yourself that you could code up a solution is very different than actually being able to write the code.
And writing the code is how you build fluency and understanding as to how computers actually work.
> I never did any meaningful amount of assembler programming. It was mostly no longer a relevant skill by the time I studied computer science (94-99). I built an interpreter for an imaginary CPU at some point using a functional programming language in my second year.
Same thing for assembly. Note that you built an interpreter for an imaginary CPU, not a real one, as that would have been a much harder challenge given that you hadn't done any meaningful amount of assembly programming and didn't understand low-level computer hardware very well.
Obviously, this isn't to say that information about how a system works can't be learned without practice - just that that's substantially harder and takes much more time (probably 3-10x), and I can guarantee you that those doing vibecoding are not putting in that extra time.
> The point of this analogy: compilers are important tools. It's more important to understand how they work than it is to be able to build one in assembler. You'll probably never do that. Most people never work on compilers. Nor do they build their own operating systems, databases, etc. But it helps to understand how they work. The point of teaching how compilers work is understanding how programming languages are created and what their limitations are.
I don't know that it's all these things at once, but most people I know who are good have done a bunch of spikes / side projects that go a level lower than they have to. Intense curiosity is good, and to the point you're making, most people don't really learn this stuff just by reading or doing flash cards. If you want to really learn how a compiler works, you probably do have to write a compiler. Not a full-on production-ready compiler, but hands on keyboard, typing and interacting with and troubleshooting code.
Or maybe to put it another way, it's probably the "easiest" way, even though it's the "hardest" way. Or maybe it's the only way. Everything I know how to do well, I know from practice and repetition.
I only learn when I do things, not when I hear how they work. I think the teacher has the right idea.
A million percent! I was so bad at math in school, which I primarily blame on the arbitrary way we were taught it. It wasn't until I was able to apply it to solving actual problems that it clicked.
Yes, I do too, but the point they were trying to make is that "learning how to write code" is not the point of CS education, but only a side effect.
> A good analogy here is programming in assembler. Manually crafting programs at the machine code level was very common when I got my first computer in the 1980s. Especially for games. By the late 90s that had mostly disappeared.
Indeed, a lot of us looked with suspicion and disdain at people that used those primitive compilers that generated awful, slow code. I once spent ages hand-optimizing a component that had been written in C, and took great pleasure in the fact I could delete about every other line of disassembly...
When I wrote my first compiler a couple of years later, it was in assembler at first, and supported inline assembler so I could gradually convert to bootstrap it that way.
Because I couldn't imagine writing it in C, given the awful code the C compilers I had available generated (and how slow they were)...
These days most programmers don't know assembler, and increasingly don't know languages as low level as C either.
And the world didn't fall apart.
People will complain that it is necessary for them to know the languages that will slowly be eaten away by LLMs, just like my generation argued it was absolutely necessary to know assembler if you wanted to be able to develop anything of substance.
I agree with you people should understand how things work, though, even if they don't know it well enough to build it from scratch.
> These days most programmers don't know assembler, and increasingly don't know languaes as low level as C either. And the world didn't fall apart.
Maybe the world didn't fall apart, but user interactions on a desktop PC feel slower than ever. So perhaps they should.
When I did a CS major, there was a semester of C, a semester of assembly, a semester of building a Verilog CPU, etc. I'd be shocked if an optimal CS education involved vibecoding these courses to any significant degree.
While I may not write assembler, there is still significant value in being able to read assembler e.g. godbolt.
I just think it's like hitting the snooze button.
Not only that, it's conditioning. I'm finding this with myself. After vibe coding for a month or so I let my subscription expire. Now when I look at the code it's like, "ugh, you mean now I have to think about this with my own brain???"
Even while vibe-coding, I often found myself getting annoyed just having to explain things. The amount of patience I have for anything that doesn't "just work" the first time has drifted toward zero. If I can't get AI to do the right thing after three tries, "welp, I guess this project isn't getting finished!"
It's not just laziness, it's like AI eats away at your pride of ownership. You start a project all hyped about making it great, but after a few cycles of AI doing the work, it's easy to get sucked into, "whatever, just make it work". Or better yet, "pretend to make it work, so I can go do something else."
I remember reading about a metal shop class, where the instructor started out by giving each student a block of metal, and a file. The student had to file an end wrench out of the block. Upon successful completion, then the student would move on to learning about the machine tools.
The idea was to develop a feel for cutting metal, and to better understand what the machine tools were doing.
--
My wood shop teacher taught me how to use a hand plane. I could shave off wood with it that was so thin it was transparent. I could then join two boards together with a barely perceptible crack between them. The jointer couldn't do it that well.
Also, in college, I'd follow the derivation that the prof did on the chalkboard, and think I understood it. Then, doing the homework, I'd realize I didn't understand it at all. Doing the homework myself was where the real learning occurred.
This concept can be taken to ridiculous extremes, where learning the actual useful skill takes too long for most participants to get to. For example, the shop class teacher taking his students out into the wilderness to prospect for ore, then building their own smelter, then making their own alloy, then forging billet, etc.
In middle school (I think) we spent a few days in math class hand-calculating trigonometry values (sine, cosine, etc.). Only after we did that did our teacher tell us that the mandated calculators we had all been using for the last few months have a magic button that will "solve" for the values for you. It definitely made me appreciate the calculator more!
When learning basic math, you shouldn't use a calculator, because otherwise you aren't really understanding how it works. Later, when learning advanced math, you can use calculators, because you're focusing on a different abstraction level. I see the two situations as very similar.
Same with essay assignments, you exercise different neural pathways by doing it yourself.
Recently in comments people were claiming that working with LLMs has sharpened their ability to organize thoughts, and that could be a real effect that would be interesting to study. It could be that watching an LLM organize a topic could provide a useful example of how to approach organizing your own thoughts.
But until you do it unassisted you haven’t learned how to do it.
The natural solution is right there in front of us but we hate to admit it because it still involves LLMs and changes on the teaching side. Just raise the bar until they struggle.
LLMs are not bicycles for the mind. They are more like E-bikes. More assist makes you go faster, but provides less exercise.
https://www.slater.dev/2025/08/llms-are-not-bicycles-for-the...
I haven't done long division in decades, am probably unable to do it anymore, and yet it has never held me back in any tangible fashion (and won't unless computers and calculators stop existing)
That makes sense. Some skills just have more utility than others. There are skills that are universally relevant (e.g. general problem solving), and then there are skills that are only relevant in a specific time period or a specific context.
With how rapidly the world has been changing lately, it has become difficult to estimate which of those more specific skills will remain relevant for how long.
I am rather positive that if you were sat down in a room and couldn't leave unless you did some mildly complicated long division, you would succeed. Just because it isn't a natural thing anymore and you have not done the drills in decades doesn't mean the knowledge is completely lost.
If you are concerned that embedding "from first-principles" reasoning in widely available LLMs may create future generations that cannot do it themselves, then I share your concern. I also think it may be overrated. Plenty of people "do division" without quite understanding how it all works (unfortunately).
And plenty of people will still come along who love to code despite AI's excelling at it. In fact, calling out the AI on bad design or errors seems to be the new "code golf".
They don't always do the simple things well which is even more frustrating.
I do Windows development and GDI stuff still confuses me. I'm talking about memory DC, compatible DC, DIB, DDB, DIBSECTION, bitblt, setdibits, etc... AIs also suck at this stuff. I'll ask for help with a relatively straightforward task and it almost always produces code that when you ask it to defend the choices it made, it finds problems, apologizes, and goes in circles. One AI (I forget which) actually told me I should refer to Petzold's Windows Programming book because it was unable to help me further.
I'd prefer it to tell me it can't help me rather than write random code that I then have to spend time debugging.
I see junior devs hyping vibe coding and senior devs mostly using AI as an assistant. I fall in the latter camp myself.
I've hired and trained tons of junior devs out of university. They become 20x as productive after a year of experience. I think vibe coding gets new devs to 5x productivity, which seems amazing, but then they get stuck there because they're not learning. So after year one, they're a 5x developer, not the 20x developer they should be.
I have some young friends who are 1-3 years into software careers, and I'm surprised by how little they know.
If I find myself writing code in a way that has me saying to myself "there has to be a better way," there usually is. That's when I could present AI with that little bit of what I want to write. What I've found to be important is to describe what I want in natural language. That's when AI might introduce me to a better way of doing things. At that point, I stop and learn all that I can about what the AI showed me. I look it up in books and trusted online tutorials to make sure it is the proper way to do it.
"Why think when AI do trick?" is an extremely alluring hole to jump headfirst into. Life is stressful, we're short on time, and we have obligations screaming in our ear like a crying baby. It seems appropriate to slip the ring of power onto your finger to deal with the immediate situation. Once you've put it on once, there is less mental friction to putting it on the next time. Over time, gently, overuse leads to the wearer cognitively deteriorating into a Gollum.
> "Why think when AI do trick?"
> grug once again catch grug slowly reaching for club, but grug stay calm
I agree 100%. But as someone with 25 years of development experience, holy crap it's nice not having to do the boring parts as much anymore.
But what has changed? Students never had a natural reason to learn how to write FizzBuzz. It's been done before, and it's not even useful. There has always been an arbitrary nature to these exercises.
I actually fear more for the middle-of-career dev who has shunned AI as worthless. It's easier than ever for juniors to learn and be productive.
Yes! You are best served by learning what a tool is doing for you by doing it yourself or carefully studying what it uses and obfuscates from you before using the tool. You don't need to construct an entire functioning processor in an HDL, but understanding the basics of digital logic and computer architecture matters if you're EE/CompE. You don't have to write an OS in asm, but understanding assembly and how it gets translated into binary and understanding the basics of resource management, IPC, file systems, etc. is essential if you will ever work in something lower level. If you're a CS major, algorithms and data structures are essential. If you're just learning front end development on your own or in a boot camp, you need to learn HTML and the DOM, events, how CSS works, and some of the core concepts of JS, not just React. You'll be better for it when the tools fail you or a new tool comes along.
Lots of interesting ways to spin this. I was in a computer science course in the late 90s, and we were not allowed to use the C++ standard library because it made you a "lazy programmer," according to the instructor. I'm not sure I agree with that, but the way I look at it is that computer science is all about abstraction, and it seems to me that AI, generative pair programming, vibe coding, or whatever you want to call it is just another level of abstraction. What is probably more important is to learn what are and are not good programming and project structures, and to use AI to abstract the boilerplate, scaffolding, etc., so that you can avoid footguns early in your development cycle.
The counterargument here is that there is a distinction between an arbitrary line in the sand (the C++ stdlib is bad) and using a text-generating machine to perform work for you from beginning to end. You are correct that, as a responsibly used tool, LLMs offer exceptional utility and value. But keep in sight the laziness of humans who focus on the immediate end result over the long-term consequences.
It's the difference between the employee who copy-pastes all of their email bodies from ChatGPT versus the one who writes a full draft themselves and then asks an LLM for constructive feedback. One develops skills while the other atrophies.
That's why it's so important to teach how to use them properly instead of demonizing them. Let's be realistic: they are not going to disappear, and students and workers aren't going to stop using them.
When in school the point is often to learn how to write complex code by writing things the standard library does.
Though in the '90s the standard library was also new and often had bugs.
It doesn't PREVENT them from learning anything - put more precisely, it lets developers become lazy and miss important learning opportunities. That's not AI's fault.
I'm not so sure. I spent A LOT of time writing sorting algo code by hand in university. I spent so much time writing assembly code by hand. So much more time writing instructions for MIPS by hand. (To be fair I did study EE not CS)
I learned more about programming in a weekend badly copying hack modules for Minecraft than I learned in 5+ years in university.
All that stuff I did by hand back then I haven't used it a single time after.
I would interpret his take a little bit differently.
You write sorting algorithms in college to understand how they work. Understanding why some are faster than others teaches you a mental model for data traversal strategies. In the real world, you will use pre-written versions of those algorithms in whatever language you work in, but you understand them well enough to know what to select in a given situation based on the type of data. This especially comes into play when creating indexes for databases.
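To make the "know what to select" point concrete, here's a toy sketch (my illustration, not the commenter's) of how that mental model shows up even when you only ever call library code:

```python
import bisect
import random

# Timsort (Python's built-in sort) exploits existing runs, so sorting
# nearly-sorted data costs close to O(n) comparisons rather than O(n log n).
nearly_sorted = list(range(10_000))
nearly_sorted[500], nearly_sorted[501] = nearly_sorted[501], nearly_sorted[500]
print(sorted(nearly_sorted) == list(range(10_000)))  # True

# If a collection must stay ordered under incremental inserts, binary
# search plus insert (bisect.insort) beats re-sorting after every append.
ordered = []
for x in random.sample(range(1_000), 50):
    bisect.insort(ordered, x)
print(ordered == sorted(ordered))  # True
```

Neither snippet requires you to implement a sort, but knowing why Timsort likes runs, or why `insort` is cheaper than repeated `sorted()` calls, is exactly the intuition the hand-written college versions were meant to build.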
What I take the OP's statement to mean is that the "meta" items revolve around learning abstractions. Write certain patterns by hand enough times and you will see the overlap, and the opportunity to refactor or to create an abstraction that can be used more effectively in your codebase.
If you vibe code all of that stuff, you don't feel the repetition as much. You don't work through the abstractions and object relationships yourself to see the opportunity to understand why and how it could be improved.
You didn't write sorting code or assembly code because you were going to need to write it on the job. It gave you a grounding in how data structures and computers work at a fundamental level. That intuition is what makes picking up Minecraft hack mods much easier.
That's the koolaid, but seriously I don't really believe it anymore.
I only had to do this legwork during university to prove that I can be allowed to try to write code for a living. The grounding, as you call it, is not required for that at all, since I'm a dozen levels of abstraction removed from it. It might be useful if I were a researcher or worked on optimizing complex cutting-edge stuff, but 99% of what I do is CRUD apps and REST APIs. That stuff can safely be done by anyone, no need for a degree. Tbf I'm from Germany, so in other places they might allow you to do this job without a degree.
I was so lucky to land in a CS class where we were writing C++ by hand. I don't think that exists anymore, but it is where I would go in terms of teaching CS from first principles
Sure (knowing the underlying ideas and having proficiency in their application) - but producing software by conducting(?) LLMs is rapidly becoming a wide, deep and must-have skill and the lack thereof will be a weakness in any student entering the workplace.
AI does have an incredibly powerful influence on learning. It can absolutely be used as a detriment, but it can also be just as powerful of a learning tool. It all comes down to keeping the student in the zone of proximal development.
If AI is used by the student to get the task done as fast as possible the student will miss out on all the learning (too easy).
If no AI is used at all, students can get stuck for long periods, either because of mismatches between the instructional design and the specific learning context (missing prerequisites) or because of mistakes in the instructional design itself.
AI has the potential to keep all learners within an ideal difficulty for optimal rate of learning so that students learn faster. We just shouldn't be using AI tools for productivity in the learning context, and we need more AI tools designed for optimizing learning ramps.
Similarly, it's always been the case that copy-pasting code out of a tutorial doesn't teach you as much as manually typing it out, even if you don't change it. That part of the problem isn't even new.
Yea, I doubt I could learn to program today if I started today.
Completely disagree. It's like telling typists that they need to write by hand to truly understand their craft. Syntax is just a way of communicating a concept to the machine. We now have a new (and admittedly imperfect) way of doing that. New skills are going to be required. Computer science is going to have to adapt.
As a teacher, do you have any techniques to make sure students learn to write the code?
In-person analog checkpoints seem to be the most effective method. Think internet-disabled PCs managed by the school, written exams, oral exams, and so forth.
Making students fix LLM-generated code until they're at their wits' end is a fun idea. Though it likely carries too high of an opportunity cost education-wise.
If I was a prof, I would make it clear to the students that they won't learn to program if they use AI to do it for them. For the students who wanted to learn, great! For those who just wanted to slide through with AI, I wouldn't care about them.
I'm an external examiner for CS students in Denmark and I disagree with you. What we need in the industry is software engineers who can think for themselves, can interact with the business and understand its needs, and who know how computers work. What we get are mass-produced coders who have been taught some outdated way of designing and building software that we need to hammer out of them. I don't particularly care whether people can write code like they work on an assembly line. I care that they can identify bottlenecks and solve them. That they can deliver business value quickly. That they will know when to do abstractions (which is almost never). Hell, I'd even like developers who know when code quality doesn't matter, because shitty code will cost $2 a year but every hour they spend on it is $100-200.
Your curriculum may be different than it is around here, but here it's frankly the same stuff I was taught 30 years ago. Except most of the actual computer science parts are gone, replaced with even more OOP, design pattern bullshit.
That being said, I have no idea how you'd actually go about teaching students CS these days, considering a lot of them will probably use ChatGPT or Claude regardless of what you do. That is what I see in the grade statistics around here. For the first 9 years I was a well-calibrated grader, but these past 1.5-ish years it's usually either top marks or bottom marks with nothing in between. Which puts me outside where I should be, but it matches the statistical calibration for everyone here. I obviously only see the product of CS educations, but even though I'm old, I can imagine how many corners I would have cut myself if I'd had LLMs available back then. Not to mention all the distractions the internet has brought.
> I don't particularily care if people can write code like they work at the assembly line. I care [...] That they can deliver business value quickly.
In my experience, people who talk about business value expect people to code like they work at the assembly line. Churn out features, no disturbances, no worrying about code quality, abstractions, bla bla.
To me, your comment reads contradictory. You want initiative, and you also don't want initiative. I presume you want it when it's good and don't want it when it's bad, and if possible the people should be clairvoyant and see the future so they can tell which is which.
I think we very often confuse engineers with scientists in this field. Think of the old joke: “anyone can build a bridge, it takes an Engineer to build one that barely stands”. Business value and the goal of engineering is to make a bridge that is fast to build, cheap to make, and stays standing exactly as long as it needs to. This is very different from the goals of science which are to test the absolute limits of known performance.
What I read from GP is that they’re looking for engineering innovation, not new science. I don’t see it as contradictory at all.
You should worry about code quality, but you should also worry about the return on investment.
That includes understanding risk management and knowing what the risks and costs are of failures vs. the costs of delivering higher quality.
Engineering is about making the right tradeoffs given the constraints set, not about building the best possible product separate from the constraints.
Sometimes those constraints requires extreme quality, because it includes things like "this should never, ever fail", but most of the time it does not.
Some of our code is of high quality. Other code can be of any quality, as it will never need to be altered in its lifecycle. If we have 20,000 financial reports which need to be uploaded once, and then it'll never happen again, it really doesn't matter how terrible the code is as long as it only uses vetted external dependencies. The only reason you'd even spend developer time on that task is that it's less error-prone than having student interns do it manually... I mean, I wish I could tell you it was to save them from a terrible task, but it'd solely be because of money.
If it's firmware for a solar inverter in Poland, then quality matters.
> people who talk about business value expect people to code like they work at the assembly line. Churn out features, no disturbances, no worrying about code quality, abstractions, bla bla.
That's a typical misconception that "I'm an artist, let me rewrite it in Rust" people often have. Code quality has a direct money equivalent; you just need to be able to justify it to the people who pay your salary.
Let them use AI and then fall on their faces during exam time - simple as that. If you can't recall the theory, paradigm, methodology, whatever by memory then you have not "mastered" the content and thus, should fail the class.
> That they will know when to do abstractions
The only way to learn when abstractions are needed is to write code, hit a dead end, then try and abstract it. Over and over. With time, you will be able to start seeing these before you write code.
AI does not do abstractions well. From my experience, it completely fails to abstract anything unless you tell it to. Even when similar abstractions are already present. If you never learn when an abstraction is needed, how can you guide an AI to do the same well?
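As a toy illustration of that kind of hand-felt repetition (the function names are hypothetical, not from the thread): you only notice the overlap because you typed the pattern out yourself, twice.

```python
# Two near-identical functions you might write on day one:
def parse_users(text):
    return [line.split(",") for line in text.splitlines() if line.strip()]

def parse_orders(text):
    return [line.split(",") for line in text.splitlines() if line.strip()]

# The duplication is only visceral once you've written it twice by hand;
# then the abstraction becomes obvious and both call sites collapse into
# one helper:
def parse_rows(text):
    return [line.split(",") for line in text.splitlines() if line.strip()]

print(parse_rows("a,1\nb,2"))  # [['a', '1'], ['b', '2']]
```

An LLM asked to "parse the orders file" will happily emit a third copy of the pattern; spotting that `parse_rows` should exist is exactly the judgment the commenter says has to be learned by doing.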
> That being said. I have no idea how you'd actually go about teaching students CS these days, considering a lot of them will probably use ChatGPT or Claude regardless of what you do.
My son is in a CS school in France. They have finals with pen and paper, with no computer whatsoever during the exam; if they can't do that they fail. And these aren't multiple choice questions, but actual code that they have to write.
I had to do that too, in Norway. Writing C++ code with pen and paper and being told even trivial syntax errors like missing semicolons would be penalised was not fun.
This was 30 years ago, though - no idea what it is like now. It didn't feel very meaningful even then.
But there's a vast chasm between that and letting people use AI in an exam setting. Some middle ground would be nice.
I wrote code in a spiral notebook because the mainframe was not available to me at home.
> I'm an external examiner for CS students
> Hell, I'd even like developers who will know when the code quality doesn't matter because shitty code will cost $2 a year but every hour they spend on it is $100-200.
> Except most of the actual computer science parts are gone, replaced with even more OOP, design pattern bullshit.
Maybe you should consider a different career; you sound pretty burnt out. These are terrible takes, especially for someone who is supposed to be fostering the next generation of developers.
I don't foster the next generations. I hire them. External examiners are people in the industry who are used as examiners to try and match educations with the needs of the industry.
It can take some people a few years to get over OOP, in the same way that some kids still believe in Santa a bit longer. Keep at it though and you’ll make it there eventually too.
Ah, see, you're outside of the US.
In the US education has been bastardized into "job training"
Good workers don't really need to think in this paradigm.
What is an "external examiner?"
External examiners are people in the industry who are used as examiners to try and match educations with the needs of the industry.
A proctor?