Comment by andai

3 months ago

At one university I went to, the head of the CS department was quoted as saying "[We don't need to care about the job market,] Our job is to create researchers."

I thought that was pretty strange at the time, because only about 5% of the students end up going into research. So that was basically him saying "I'm totally cool with our educational program being misaligned for 95% of our customers..."

Maybe it makes sense for the big picture though. If all the breakthroughs come from those 5%, it might benefit everyone to optimize for them. (I don't expect they would have called the program particularly optimized either though ;)

Well, you can say there is a difference between "computer science" and "software engineering", and many universities are particularly research-focused.

Someone with a chemistry, physics, or even MechE BS comes out at the very beginning of their training, and will require lots of specific on-the-job training if they go into industry. School is about the principles of the field and how to think critically and experimentally. E.g. software debugging requires an understanding of hypothesis testing and isolation before the details of any specific tech ever come into play. This is easy to take for granted because many people have that skill naturally; others need to be trained, and some still never quite get it.
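
To make that concrete, here's a toy sketch of the isolation part (in Python, with a hypothetical fails() predicate standing in for "run the reproduction"). The method is just "form a hypothesis, halve the search space, test, repeat" -- nothing about it depends on the specific tech:

    # Toy sketch of fault isolation via bisection, independent of any
    # specific tech stack. fails() is a hypothetical predicate meaning
    # "does this subset of suspects still reproduce the bug?" Assumes
    # exactly one culprit, so a failing half always contains it.
    def find_culprit(suspects, fails):
        assert fails(suspects)  # hypothesis: the full set reproduces the bug
        while len(suspects) > 1:
            half = suspects[:len(suspects) // 2]
            # Test "the culprit is in the first half"; either outcome
            # cuts the search space in two.
            suspects = half if fails(half) else suspects[len(suspects) // 2:]
        return suspects[0]

    print(find_culprit(list(range(100)), lambda s: 42 in s))  # -> 42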

Edit: of course, if only 5% of grads go on to research, then maybe the department is confused. A lot of prestigious schools market themselves as research institutions, advertise their undergrad research opportunities, etc. If you choose to go there, you know what you're getting into.

  • > A lot of prestigious schools market themselves as research institutions

    Out of one side of their mouth, maybe.

    Out of the other, they're certainly not telling potential undergrads that undergrads are merely tolerated and the real focus is research.

> I don't expect [the 5% of students who end up going into research] would have called the program particularly optimized either

This. I went to the University of Iowa in the aughts. My experience was that because they didn't cover a lot of the material on this MIT Missing Semester 2026 list, a lot of the classes went poorly. They had trouble moving students through the material on the syllabus, because most students would trip over the kinds of computing basics that are necessary to experiment with the DS+A theory via actual programming. And the department neither added a prereq that covers these basics nor incorporated them into other courses' syllabi. Instead, they kept trying what wasn't working: tolerating a huge gap between the nominal material and what the average student actually got (while somehow letting them go on to the next course).

I don't think it did any service to anyone. They could have taken time to actually help most students understand the basics, they could have proceeded at a quicker pace through the theoretical material for the students who did understand the basics, they could have ensured their degree actually was a mark of quality in the job market, etc.

It's nice that someone at MIT is recognizing this and putting together this material. The name and about page suggest, though, that it's not something the department has long recognized and uncontroversially integrated into the program (perhaps as an intro class you can test out of), which is still weird.

  • > It's nice that someone at MIT is recognizing this and putting together this material. The name and about page suggest, though, that it's not something the department has long recognized and uncontroversially integrated into the program (perhaps as an intro class you can test out of), which is still weird.

    While this comes out of CSAIL, I wouldn't ascribe too much institutional recognition to it. Given the existence of the Independent Activities Period, it's probably a reasonable place for this sort of thing within MIT's setup. Other institutions have "math camp" and the like before classes start.

    It's probably a reasonable compromise. Good schools have limited bandwidth or interest in remedial education/hand-holding and academics don't have a lot of interest in putting together materials that will be outdated next year.

    • > Good schools have limited bandwidth or interest in remedial education/hand-holding and academics don't have a lot of interest in putting together materials that will be outdated next year.

      I think they rarely escape doing this hand-holding unless they're actually willing to flunk out students en masse. Maybe MIT is; the University of Iowa certainly wasn't. So they just end up in a state of denial, in which they say they're teaching all this great theoretical material but do a half-assed job of teaching either body of knowledge.

      I also don't think this knowledge gets outdated that quickly. I'd say if they'd put together a topic list like this for 2006, more than half of the specific tools would still be useful, and the concepts from the rest would still transfer pretty well to what people use today. For example, yeah, we didn't have VS Code and LSP back then, but IDEs didn't look that different. We didn't (quite) have tmux, but we used screen for the same purpose, etc. Some things are arguably new (devcontainers have evolved well beyond setting up a chroot jail; AI tools are new), but it's mostly additive. If you stay away from the most bleeding-edge stuff (I'm not sure the "AI for the shell (Warp, Zummoner)" topic is wise to spend much time on), you never have to throw much out.

Historically, the point of a university was not to be a job-training program.

  • It kind of depends on how you define "history". Before STEM dominated the hiring landscape, universities were less career-focused. No employers in these fields, as far as I know, have ever offered apprenticeships to teach new hires chemical engineering or applied mathematics from the ground up. University will not prepare you for a corporate job, exactly, but it gives you a background that lets you step into one, or go into research, etc. Lots of employers expect new hires to have research skills as well.

    I think there are a number of ways in which financial incentives and university culture are misaligned with this reality.

Probably one of those thoughts you should self-filter (and the alumni association sure wishes you would).

But it's also the case that (only half-joking) a lot of faculty at research universities regard most undergrads as an inconvenience at best.

  • It depends on the university, but the filtering thing is very true.

    In my experience, the more advanced the material, the worse the teachers are. Or more precisely, the improvement in teaching does not usually keep up with the increase in difficulty. (There appears to be no correlation, in fact.)

    Which implies that the better a university is (the more difficult the material), the more it relies on filtering rather than education.

    Which seems to be in line with how the top universities are perceived anyway: primarily as selection criteria and places to get a network, rather than places to get an education.

    It's neither good nor bad, but it is a little sad :)

    ---

    I do notice that my assumption here is that the more difficult the university is, the better it is. I think this is broadly true, both objectively and subjectively, at least for my purposes.

They like to say things like that, or some version of "we want to teach the concepts; the specific technology changes too fast". Does it? Just seems lazy to me.

Some schools have a separate degree program in informatics or computer technology for precisely this reason -- computer science is a different field.