
Comment by raddan

5 days ago

I think what is going to happen is that junior devs will develop a reliance on AI tools so strong that they can't do anything without them. I cynically think this was OpenAI’s aim when they made ChatGPT free for students.

I had a rather depressing experience this semester in my office hours with two students who had painted themselves into a corner with code that was clearly generated. They came to me for help, but were incapable of explaining why they had written what was on their screens. I decided to find where they had lost the thread of the class and discovered that they were essentially unable to write a hello-world program. In other words, they lost the thread on day one. Up until this point, both students had nearly perfect homework grades while failing every in-class quiz.

From one perspective I understand the business case for pushing these technologies. But from another perspective, the long-term health of the profession, it’s pretty shortsighted. Who knows, in the end maybe this will kill off the group of students who enroll in CS courses “because mom and dad think it’s a good job,” and maybe that will leave me with the group that really wants to be there. In the meantime, I will remind students that there is a difference between programming and computer science, and that you really need a strong grasp of the latter to be an effective coder. Especially if you use AI tools.

> Who knows, in the end maybe this will kill off the group of students who enroll in CS courses “because mom and dad think it’s a good job,”

I see this so much. “Data science major” became the 2020s version of law school. It’s such a double-edged sword. It’s led to a huge increase in enrollment and the creation of multiple professional master’s programs, so the college loves us. We hire every year and there’s always money for just about anything. On the other hand, class sizes are huge, which is not fun, and, worse, a large fraction of the students appear to have minimal intrinsic interest in coding or analyzing data. They’re there because it’s where the jobs are. I totally get that; in some sense college has always been that way. But it does make me look back fondly on the days when classes were a quarter the size and filled with people who were genuinely interested in the subject.

Unfortunately, I think I may get my wish. AI is going to eliminate a lot of those jobs, and so the future of our field looks a bit bleak. Worse, it’s the very students who are going to become redundant the quickest who are the least willing to learn. I’d be happy to teach them basic analysis and coding skills, but they are dead set on punching everything into ChatGPT.

> I cynically think this was OpenAI’s aim when they made ChatGPT free for students

Is there any interpretation that makes sense _other_ than this?

  • It's only free for two months. I'm surprised they don't offer a similar free trial for everyone.

    • There are all sorts of stereotypes about frugal students that have some truth to them, but at the same time the education market is in some respects a cash cow; look at tuition costs, or the cost of textbooks. If students are already spending a lot of largely loaned money on education, I guess charging them, while having them think of it as an education tool, gets OpenAI some revenue.

> Up until this point, both students had nearly perfect homework grades while failing every in-class quiz.

This is nothing new. In a computer graphics class I took over 20 years ago, the median score on the assignments before the midterm was >100% (thanks to bonus questions), yet during midterm prep other students in the class were demonstrating that they didn't even have a firm grasp of the basic concept of a matrix.

That is: they were in a fourth-year undergrad course while shaky on material from a senior-year high school course in which they had to have gotten high marks in order to get into the program.

And the midterm grading was heavily curved as a result (though not as much as in some other courses I took).

Students will do what they need to do for the grade. It seems a great many of them have internalized that none of this is about actually learning anything, even if they would never say so aloud. (I learned things - where I didn't already know them - because it was actually interesting. My resulting grades were pretty good overall, but certainly not top of class.)

> Who knows, in the end maybe this will kill off the group of students who enroll in CS courses “because mom and dad think it’s a good job,”

Why would it? It's becoming easier than ever to fake understanding, and to choose any other path those students would need both the opportunity and the social permission to do so. I only see the problem getting worse.

> Up until this point, both students had nearly perfect homework grades while failing every in-class quiz.

From a student's perspective: I think it was the same with Stack Overflow. While LLMs make copy-and-paste even easier, they also have the upside of lowering the barrier to entry for more complex topics and projects. Nowadays the average person doesn't touch assembly, but we still had a course where we used it and learned its principles. Software engineering courses will follow suit.

  • StackOverflow users at least tried to fight against it. The disdain for anything that looked like "homework questions" was one of the reasons it got a bad reputation among some people.

Hard capitalism doesn't care about long-term perspectives; the only yardstick is current performance and stock maximization. Otherwise the US would be a bastion of stellar public education, for example: an investment in the long-term future of the whole nation, instead of the few richest sending their kids to private schools to stay above the rest.

So while I fully agree with you, this is not a concern for any single decision-maker in the private-company world. And a state such as the US doesn't pick up this work instead, quietly acquiescing to the situation.

Well, think for a second about who sets similar budgets and long-term spending priorities: rich lawyers who chose to become much richer politicians, rarely anybody else, and almost never anyone from a more moral profession.