Comment by bradley13

7 days ago

"The professors use AI themselves and expect their students to use it, but do not have a gauge for what kinds of problems make for an appropriately difficult assignment in the modern era."

I'm a prof, recently retired but still teaching part-time. This is exactly the problem. AI is here, people use it, so it would be stupid (plus impossible) not to let students use it. However, you want your students to still learn something about CS, not just about how to prompt an AI.

The problem we are heading towards as an industry is obvious: AI is perfectly capable of doing most of the work of junior-level developers. However, we still need senior-level developers. Where are those senior devs going to come from, if we don't have roles for junior devs?

Not just that. As a 31-year-old developer, even I feel like acquiring new skills is now harder than ever. Having Claude come up with good solutions to problems feels fast, but I don't learn anything by doing so. It took me weeks to understand what good and bad CMake code look like, and that made me the CMake guy at work. The learning curve delayed the port from qmake to CMake quite a bit, but I learned a new skill.

  • Claude has a teacher mode where it will ask you questions.

    I’m picking up game dev in my spare time. I’m not letting Claude write any of the code. We talk through the next task, I take a run at it, and when I’m stuck I go back and talk through where the problems are.

    It’s slower than just letting Claude do it, obviously. Plus you do need to be a bit disciplined: Claude will gladly do it for you when you start getting tired. But I am picking it up, and I’m not getting bogged down in the beginner stage of ‘impossible-feeling bugs you can’t figure out because you’re still learning and don’t fully understand everything yet’.

  • What I find interesting about your perspective is your subjective perception of difficulty. Nobody short of a savant is going to pick up a new language instantly. Weeks (if not months) to learn a language is completely normal outside of this hyper-exaggerated atmosphere we find ourselves in. That being said, language models do atrophy the brain when used in excess, and they do encourage surface-level understanding, so I wholeheartedly agree with your point about not learning anything by using them.

  • I’m 37 and have coded my entire life. I even got to do the drop-out-of-college, start-a-company, make-money thing before I took my current position. I have to say AI has sucked the heart and soul out of coding. It’s the most boring thing, having to sit and prompt. Not to mention the slop, the nonsense hype, etc. Never attach your identity to your job or a skill; many of us do that, only to be humbled when a new advancement occurs. Programming, reading open source code to contribute to, all of it… is just lifeless, literally and figuratively. Sorry for the long rant, I needed to vent.

    • You and me both :-(

      I see open source projects entirely run by clueless LLM-using idiots, and existing projects overrun by them, and there is none of the quality or passion you would normally see.

      Even if I were to apply my skill/energy to a project of my own, my code would just get stolen by these LLM companies to train their models, and regurgitated with my license removed. What's the point?

  • I can't imagine that you aren't learning a lot just from getting the AI to do what you need it to do. I haven't learned as much in a long time as I have in the last 6 months, including grounding theory and new kinds of testing. I feel like a PhD student again.

  • I have a block of instructions I put in the CLAUDE.md file of any project where I want a better understanding of the tech in use: it asks for verbose explanations, forces me to write some of the code myself, and so on. Mixed results so far, but I think it will get there. The one thing I have decided: only one new thing per project!
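
    For illustration, a sketch of what such a CLAUDE.md block might look like; the exact wording here is invented, not the commenter's actual file:

    ```markdown
    ## Learning mode (hypothetical example)

    - Explain every non-obvious suggestion verbosely: what it does, why, and what the alternatives were.
    - Do not write implementation code for me. Outline the approach, let me write it, then review my code.
    - When I ask for a fix, point me at the relevant concept or documentation first.
    - Limit yourself to one new technology per project.
    ```

    Since CLAUDE.md is read as project-level instructions, a block like this nudges the assistant toward tutoring rather than doing the work outright.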

  • Interesting. I've felt like it's never been easier to learn things, but I suppose that's not quite the same as "acquiring new skills". I don't know if it applies, but hasn't it always been easy to take the easy way out?

    I feel like AI has made it a bit easier to do harder things too.

  • You are on the internet.

    You can download every book or tutorial ever made in our history.

    We have access to vast knowledge.

You can become a building architect without first becoming a brick mason. Working effectively with AI is much more about planning, architecture, directing, and so on. The education system will need to adapt, but things are moving so fast that I suspect we’re in for a massive shock as the mismatch between education and job roles widens.

To me the solution seems simple, but I have no idea how to implement it in a classroom/uni environment.

Students should be building software hands-on, and yes, they should use AI, but there shouldn't be a fixed end state beyond something like "6 hours of work", or however long is reasonable in their schedule. The instructor should push them to build more features, or add constraints that obsolete most of their work.

Eventually there will be spots in the code that only the student and the professor understand; in some limited instances, the professor can explain what some generated code does.

Alternatively, students can use generated code, but they have to provide a correctness proof, and most of the class is then based on studying proofs. It depends on whether it's a more CS/SE- or industry-focused group of students, and on their math background.
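
As a minimal sketch of what such a submission could look like (the function and theorem names here are invented for illustration), a student might hand in a possibly generated function together with a machine-checked proof of its specification, e.g. in Lean:

```lean
-- Hypothetical example: `double` stands in for a piece of generated code;
-- the theorem states the specification it must satisfy.
def double (n : Nat) : Nat :=
  n + n

-- Proof obligation: the implementation meets the spec `2 * n`.
-- `omega` discharges the linear-arithmetic goal after unfolding.
theorem double_spec (n : Nat) : double n = 2 * n := by
  unfold double
  omega
```

Grading then shifts from reading the code to checking that the stated theorem actually captures the assignment's requirements.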

To me it seems that the path to seniority would shift. The question is hard to answer because we're looking at it through the lens of 'fundamental knowledge'. It seems to me that this is now less of a requirement than 'systems-level thinking'. A very simple example is language syntax versus program structure and how the parts work together. A junior developer would still lack that experience, and I don't think AI tools would be a problem in developing it.

All I say, though, is from the perspective of a self-taught dev, not a CS student. The current level of LLMs is still far from being a proper replacement for fundamental skills in complex software, in my eyes. But the version we have now is the worst it will ever be.

> so it would be stupid (plus impossible) not to let students use it

It's been plenty of years since my college days, but even back then professors had to deal with plagiarism and cheating. The class was split into a lecture plus a lab. In the lab, you used school computers with only intranet access (old Solaris machines, IIRC), and tests were all in-class, pen-and-paper.

Of course, they weren't really interested in training people to be "developers"; they were training computer scientists. C++ was the most modern language taught, because "web technologies" changed too quickly for a four-year degree to keep up, they argued.

Times have changed quite a bit.

Everyone is just hoping that in five years, when new seniors are needed, developers in cheaper eastern markets will have become seniors by then, or that AI can replace them.

We'll figure it out the hard way.

It's like when bootcamps were all the rage, promising an easy career path. The floor has been raised now. Companies will eventually pay a premium for competent devs once they figure it out, and it will once again be an attractive career path, but for now it's a shit show.

If 90% of your class turns off their brains when learning with AI, then focus on the 10% who understand that you need to crawl before attempting anything else.

No human devs will be required (or useful, except in extreme niches) within a few years. Ten at the wild maximum, I suspect.