
Comment by ang_cire

5 days ago

One thing that really bothered me, which the author glossed over (perhaps they don't care, given the tone of the article), is where they said:

> Does an intern cost $20/month? Because that’s what Cursor.ai costs.

> Part of being a senior developer is making less-able coders productive, be they fleshly or algebraic.

But do you know what another part of being a senior developer is? Not just making them more productive, but also guiding the junior developers into becoming better, independent, self-tasking, senior coders. And that feedback loop doesn't exist here.

We're robbing ourselves of good future developers, because we aren't even thinking about the fact that the junior devs are actively learning from the small tasks we give them.

Will AI completely replace devs before we all retire? Maybe. Maybe not.

But long before that, we're going to feel a major loss, a domestic brain drain: the future coders who aren't being hired and trained, because a senior dev doesn't understand that junior devs become senior devs (and that's an important pipeline) and would rather pay $20/month for an LLM.

I think what is going to happen is that junior devs will develop a strong reliance on AI tools to be able to do anything. I cynically think this was OpenAI’s aim when they made ChatGPT free for students.

I had a rather depressing experience this semester in my office hours with two students who had painted themselves into a corner with code that was clearly generated. They came to me for help, but were incapable of explaining why they had written what was on their screens. I decided to find where they had lost the thread of the class and discovered that they were essentially unable to write a hello-world program. In other words, they lost the thread on day one. Up until this point, both students had nearly perfect homework grades while failing every in-class quiz.

From one perspective, I understand the business case for pushing these technologies. But from another perspective, the long-term health of the profession, it’s pretty shortsighted. Who knows, in the end maybe this will kill off the group of students who enroll in CS courses “because mom and dad think it’s a good job,” and maybe that will leave me with the group that really wants to be there. In the meantime, I will remind students that there is a difference between programming and computer science, and that you really need a strong grasp of the latter to be an effective coder. Especially if you use AI tools.

  • > Who knows, in the end maybe this will kill off the group of students who enroll in CS courses “because mom and dad think it’s a good job,”

    I see this so much. “Data science major” became the 2020s version of law school. It’s such a double-edged sword. It’s led to a huge increase in enrollment and the creation of multiple professional masters programs, so the college loves us. We hire every year and there’s always money for just about anything. On the other hand, class sizes are huge, which is not fun, and worse, a large fraction of the students appear to have minimal intrinsic interest in coding or analyzing data. They’re there because it’s where the jobs are. I totally get that, and in some sense college has always been that way, but it does make me look back fondly on the days when classes were 1/4 as big and filled with people who were genuinely interested in the subject.

    Unfortunately I think I may get my wish. AI is going to eliminate a lot of those jobs and so the future of our field looks a bit bleak. Worse, it’s the very students who are going to become redundant the quickest that are the least willing to learn. I’d be happy to teach them basic analysis and coding skills, but they are dead set on punching everything into ChatGPT.

  • > I cynically think this was OpenAI’s aim when they made ChatGPT free for students

    Is there any interpretation that makes sense _other_ than this?

  • > Up until this point, both students had nearly perfect homework grades while failing every in-class quiz.

    This is nothing new. In a computer graphics class I took over 20 years ago, the median score on the assignments before the midterm was >100% (thanks to bonus questions), yet in midterm prep other students in the class were demonstrating that they didn't even have a firm grasp on the basic concept of a matrix.

    That is: they were in a 4th-year undergrad course, while shaky on material from a senior-year high school course where they had to have gotten high marks in order to get into the program.

    And the midterm grading was heavily curved as a result (though not as much as in some other courses I took).

    Students will do what they need to do for the grade. It seems a great many of them have internalized that none of this is about actually learning anything, even if they would never say so aloud. (I learned things - where I didn't already know them - because it was actually interesting. My resulting grades were pretty good overall, but certainly not top of class.)

    > Who knows, in the end maybe this will kill off the group of students who enroll in CS courses “because mom and dad think it’s a good job,”

    Why would it? It's becoming easier than ever to fake understanding, and to choose anything else they would need both the opportunity and social permission. I only see the problem getting worse.

  • > Up until this point, both students had nearly perfect homework grades while failing every in-class quiz.

    From a student's perspective: I think it was the same with Stack Overflow. While LLMs make copy-and-paste even easier, they also have the upside of lowering the bar on more complex topics/projects. Nowadays, the average person doesn't touch assembly, but we still had a course where we used it and learned its principles. Software engineering courses will follow suit.

    • StackOverflow users at least tried to fight against it. The disdain for anything that looked like "homework questions" was one of the reasons it got a bad reputation among some people.


  • Hard capitalism doesn't care about long-term perspectives; the only yardstick is current performance and stock maximization. Otherwise the US would be a bastion of stellar public education, for example, an investment in the long-term future of the whole nation, instead of the few richest sending their kids to private schools to stay above the rest.

    So while I fully agree with you, this is not a concern for a single decision maker in the private-company world. And a state such as the US doesn't pick up this work instead; it quietly accepts the situation.

    Well, think for a second about who sets those budgets and long-term spending priorities: rich lawyers who chose to become much richer politicians, rarely anybody else, and almost never anyone from a more moral profession.

It's a bit misleading to compare $20/month with an actual human person. The junior dev won't get halfway through the day and tell you they've used up all their coding time for the month and will now respond with gibberish.

Cursor is a heck of a lot more than $20/month if you actually want it working for a full work day, every day.

  • > The junior dev won't get halfway through the day and tell you they've used up all their coding time for the month and will now respond with gibberish.

    This issue manifests a bit differently in people, but I've definitely worked with people (not only juniors) who only have a few productive hours a month in them. And for what it's worth, some of those people were sufficiently productive in those few hours that it was rational for the company to keep them.

  • I worked on a team where a new hire from a prestigious school told his manager "That work is boring and mundane and I'm not going to do it."

    He didn't last long.

    • Yeah, so you fire them and replace them with another human. That's still vastly cheaper than a person plus a per-token AI fee.

    • And it wouldn't surprise me if he is now the boss of people like his prior boss.

  • My coworkers are burning $10k/month on Cursor.

    • how?? I consider myself pretty heavy handed with letting Gemini fill up its 1M token context window for individual tasks, and I don't exceed $20 per day. Do they just let the agent spin in a loop for 4 hours?

  • Maybe - but it will still probably be less than a junior dev.

    You could probably hammer the most expensive cursor API all-day every-day and it would still be a fraction of the cost of a junior dev.
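
    The spread in this sub-thread ($20/day at the low end, $10k/month at the high end) mostly comes down to assumed usage. A minimal back-of-envelope sketch, where the per-token price, the usage levels, and the junior salary are illustrative assumptions, not real quotes:

    ```python
    # Hypothetical numbers only: the blended API price and usage levels are assumed.
    PRICE_PER_M_TOKENS = 10.0  # USD per million tokens (assumed blended rate)
    WORK_DAYS = 22             # working days per month

    def monthly_cost(tokens_per_day: float) -> float:
        """API spend per month at a given daily token volume."""
        return tokens_per_day / 1e6 * PRICE_PER_M_TOKENS * WORK_DAYS

    print(f"light agent use, 2M tokens/day:  ${monthly_cost(2e6):,.0f}/month")   # ~$440
    print(f"agent looping all day, 50M/day:  ${monthly_cost(50e6):,.0f}/month")  # ~$11,000
    print("junior dev, salary + overhead (assumed): $8,000+/month")
    ```

    Under these assumptions, both claims in the thread can be right: light use is a rounding error next to a salary, while an agent left to loop all day lands in the same bracket as a junior dev.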

Further, Cursor might cost $20/month today, but to what degree is that subsidized by VC investment? All the information we have points to frontier models just not being profitable to run at those types of prices, and those investors are going to want a return at some point.

  • the market will indeed balance this out. remember when a taxi was $20 and an uber $5? now an uber is $25. nobody is going to go back to humans with all their wet meat sack problems, we will get more value for it, but it ain't gonna stay $5 if those putting up all this capital have anything to do with it. then again, we might get cheap, self-hostable local copies (unless they're made illegal for "safety" or some bullshit)

    • I think the most likely thing is the cheap self hostable copies will broadly stop improving significantly. It'll be too costly for a community project to distill a bleeding edge cloud model and companies will stop releasing them. What's free now will remain free, we might even get another gen or 2 of improvements (possibly with diminishing returns) on free/cheap local models but those days are numbered.


    • sounds like uber is ripe for disruption by somebody who doesn’t need the accounts to balance yet

  • I dunno, with the advances in open source models I could see in a few years having AI workstations that cost $20,000 with 1TB of VRAM so you don’t have to rely on OpenAI or Cursor. The RTX 6000 Pro is only $7500 and has 96GB of VRAM.
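
    For scale, a rough sanity check using only the numbers in the comment above (the 1TB target is the commenter's hypothetical; this ignores chassis, power, and interconnect):

    ```python
    import math

    # Figures quoted above; 1 TB is the hypothetical target.
    target_vram_gb = 1024
    gpu_vram_gb = 96       # RTX 6000 Pro
    gpu_price_usd = 7_500  # RTX 6000 Pro

    gpus_needed = math.ceil(target_vram_gb / gpu_vram_gb)  # 11 cards
    cost_today_usd = gpus_needed * gpu_price_usd           # $82,500

    print(f"{gpus_needed} GPUs -> ${cost_today_usd:,} at today's prices")
    ```

    So a $20,000 / 1TB workstation implies roughly a 4x drop in the price per GB of VRAM, which is the "in a few years" part of the bet.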

This is something that's been simmering in the back of my mind for a while. Using an AI agent instead of talking to your human colleagues deprives both of you of learning opportunities. There are probably short-term gains in many cases, but I fear there will be long-term losses over time.

  • I agree, and I think that organizations that figure out how to use AI well, in a collaborative way, will succeed in the long term. The developer community is still where the important growth happens.

  • Is it possible to talk to coworkers? What if you voted for the wrong party? Are closeted gay/trans/queer? A radical feminist?! Or a dog attack survivor, and they really _REALLY_ like dogs!

    Talking to colleagues at work is a chore, and a huge risk! Not an opportunity! At least AI respects my privacy, and will not get me fired!

> But do you know what another part of being a senior developer is? Not just making them more productive, but also guiding the junior developers into becoming better, independent, self-tasking, senior coders. And that feedback loop doesn't exist here.

Almost every senior developer I know is spending that time making LLMs more productive and useful instead.

Whatever you think the job is of the senior developer, it will not be "coding".

I think people need to stop thinking of themselves as computer programmers and start thinking of themselves as _engineers_. Your job isn't writing programs, your job is _using the technology you have available to solve problems_. Maybe that is through writing code, but maybe it's orchestrating LLMs to write code for you. The important part is solving the problem.

  • > Almost every senior developer I know is spending that time making LLMs more productive and useful instead.

    LLMs may become more productive/accurate/useful, but they're not self-tasking or independent.

    > I think people need to stop thinking of themselves as computer programmers and start thinking of themselves as _engineers_. Your job isn't writing programs, your job is _using the technology you have available to solve problems_.

    There is a progression of skill required to master any profession, starting with fundamentals, and progressing and developing until you are an expert/ senior at that profession. How is a senior sw dev supposed to become that without writing code? Just reading LLM code and bugfixing isn't the same level or kind of experience. You're going to have devs who can't code by themselves, and that's a bad place to be in.

    There are already too many people in IT using tools that they don't understand the workings of (and thus can't troubleshoot, can't replace, can't customize to their env, etc), and this will just exacerbate that x100.

    MMW there is going to be a very bad skill deficit in IT in 20 years, which is going to cause an innovation deficit.

How many senior developers understand the minute, intimate details of the frameworks, libraries, and languages they use? How many understand the databases they use? TFA says many (but not all) don't have to care, as long as the product ships. That's exactly how code written by LLMs is meant to be tested and evaluated. And if you set up a good enough build/test environment, TFA argues that you can automate most of the schlep away.
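
As one concrete illustration of that last point, here is a minimal sketch of such a gate: treat the LLM's output like any other patch and accept it only if the build and tests still pass. The `make` targets are hypothetical placeholders for whatever the project actually uses.

```python
import subprocess

def gate_generated_change(repo_dir: str) -> bool:
    """Accept a generated change only if the project still builds and its tests pass."""
    for cmd in (["make", "build"], ["make", "test"]):
        # Run each step in the repo; a nonzero exit code means rejection.
        if subprocess.run(cmd, cwd=repo_dir).returncode != 0:
            return False  # feed the failure output back to the model instead
    return True
```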

  • In my experience, senior engineers without the curiosity to occasionally dig deeper into their frameworks are significantly worse engineers. No framework, library, or language is perfect. A big part of your job is understanding how your code will execute. Sometimes that even requires, at a very high level, imagining how that code will compile down to assembly once you strip away all the abstractions.

    Eventually you will get a memory leak even in a GC'd language. Eventually there will be some incredibly obscure, unreported bug in a library. Eventually you will find an issue in unmaintained code you depend on. Eventually there will be performance problems caused by too many layers of abstraction.

    You either need to know, roughly, how your dependencies work by occasionally digging into their code/reading the documentation. Or you need intuition to know how it probably works, but you usually build that intuition by actually writing/reading code.
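
    To make the GC'd-language point above concrete, a minimal hypothetical sketch (the cache and handler names are invented): the collector only frees unreachable objects, so an ever-growing but still-reachable cache leaks memory just fine.

    ```python
    # Module-level cache: reachable for the whole process lifetime,
    # so the garbage collector will never free its entries.
    _results_cache: dict[str, bytes] = {}

    def handle_request(request_id: str, payload: bytes) -> bytes:
        # Memoizing on a key that never repeats (unique request IDs) means
        # every entry is retained forever and the dict grows without bound.
        if request_id not in _results_cache:
            _results_cache[request_id] = payload.upper()  # stand-in for real work
        return _results_cache[request_id]
    ```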

    • The people who want to look under the hood even if they have no immediate reason to do so will always exist, and the people who don't care and just learn the bare minimum to produce app features that usually work will always exist. All LLMs do is decrease the skill requirement to be developer B, but they also make it easier to learn what you need to be developer A.


Well, I hope that happens. You go apply for a job and 10k people have already applied. What kind of job market is this?

>We're robbing ourselves of good future developers

You call it robbing ourselves of good future developers; I call it an hourly consultancy rate increase.

I imagine it like this: juniors will be taught by LLMs on some things, but seniors will still be there. They will still assist, pair program, code review, etc., but they will have another party, the LLM, like a smarter calculator.