Comment by someguyiguess

7 days ago

To be fair, college CS programs have always been decades behind in my experience. Maybe schools like Stanford and MIT are different but the majority of CS programs are not teaching tech that is actually used in the business world.

Maybe I’m an oddball, but I’d rather hire a new grad with sound fundamentals who learned on an older tech stack than somebody with all the buzzwords but no fundamentals.

And I’ve always found summer internships to be a good way to find out. Even better if the candidate is willing to work part-time through their senior year.

  • Yeah. I see a phrase like “hirable skills” and… it feels like “skills” that are probably going to be outdated every couple of months.

    • 100%.

      For me, "hireable skills" (for a new grad) are things like "can do a basic whiteboard exercise". I'll ask them to sketch out a program to solve a business problem. I do higher-ed software, so I usually start with "build a class registration system from scratch" - they're recent grads, so the problem domain is known; there's plenty of space for the discussion to move in several different directions; and it fits nicely into 20-30 minutes.

      Bare minimum, I'd expect them to ask clarifying questions (particularly around system constraints, performance, etc.). And then sketch out a very basic system diagram (I don't expect them to know AWS or Azure, but I do want to see things like "ID provider", "course catalog", "waitlist service", etc.). Then I'll pick a service and have them pseudocode some of it.

      Sadly, somewhere around 50% of grads CANNOT do the above. I'm not sure how, but I've left interviews thinking "I hope they get a refund" more than a few times.
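      For what it's worth, the level of sketch I'm hoping for when we pseudocode a service looks roughly like this (a minimal illustration; the names, rules, and return strings are all made up for the example):

```python
# Toy "waitlist service" for one course section: first-come, first-served,
# with automatic promotion from the waitlist when a seat opens up.
class Waitlist:
    def __init__(self, capacity):
        self.capacity = capacity  # seats in the section
        self.enrolled = []        # student ids holding a seat
        self.waiting = []         # FIFO queue of student ids

    def register(self, student_id):
        if student_id in self.enrolled or student_id in self.waiting:
            return "already registered"
        if len(self.enrolled) < self.capacity:
            self.enrolled.append(student_id)
            return "enrolled"
        self.waiting.append(student_id)
        return "waitlisted"

    def drop(self, student_id):
        if student_id in self.enrolled:
            self.enrolled.remove(student_id)
            if self.waiting:  # promote the next student in line
                self.enrolled.append(self.waiting.pop(0))
```

      Nothing fancy - but a candidate who can get here, and then talk about what changes when two registrations race for the last seat, is well ahead of the pack.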

  • The Pythagoras theorem doesn’t change even if you use an LLM. Fundamentals shouldn’t either. Don’t see why schools should see this any differently.

    • > The Pythagoras theorem doesn’t change even if you use an LLM.

      Indeed. But it does change if you want an answer on a non-Euclidean surface, e.g. large-scale things on the surface of the Earth, where questions like "what's a square?" don't get the common-sense answer you may expect them to have.

      I bring this up because one of my earlier tests of AI models has been how well they deal with this, and it took a few years before I got even one correct answer to my non-Euclidean problem. Even then, the model only got it right by importing a Python library into a code interpreter that did that part of the work on its behalf.
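      To make the divergence concrete, here's a rough sketch (the scale constants are approximate, and this is an illustration, not the test I actually use): flat-plane Pythagoras agrees with the great-circle distance at city scale but drifts badly at continental scale.

```python
import math

R = 6371.0  # mean Earth radius in km (approximate)

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance on a sphere via the haversine formula.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def planar_km(lat1, lon1, lat2, lon2):
    # Treat degrees as flat x/y coordinates (km-per-degree scaling) and
    # apply a^2 + b^2 = c^2 -- fine locally, increasingly wrong globally.
    mean_lat = math.radians((lat1 + lat2) / 2)
    dx = (lon2 - lon1) * math.cos(mean_lat) * 111.32
    dy = (lat2 - lat1) * 110.57
    return math.hypot(dx, dy)
```

      Across a city block the two agree to a fraction of a percent; between two points at latitude 60° separated by 90° of longitude, the flat-plane answer is off by several percent, because "straight line" stops meaning what intuition says it means.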

  • You're sound.

    The problem is when you've got a new grad with no fundamentals and 10 year old buzzwords.

    I've had the misfortune of working with exactly that, except not even a new grad: they'd been on the same project for something like a decade, and they were 10 to 20 years out of date both in how they thought about what computers now do under the hood with code and in the fundamentals of writing that code in the first place.

  • Yes, it is just you. Every job opening gets hundreds of applications. A company is not going to hire someone with no experience or knowledge over someone who does.

    • > A company is not going to hire someone with no experience or knowledge over someone who does.

      “Solid fundamentals” are literally knowledge.

      That said, you’re probably right. At least in data, hundreds of mediocre-to-awful hiring managers have convinced themselves that their stack is special and there’s no way someone without experience in Snowflake (or whatever) could possibly figure it out based on experience in other stacks.

      On the plus side, it's meant that anyone who's not intentionally shooting themselves in the foot can find a ton of high-end talent, because they recognize that knowing a specific stack is worth little compared to understanding how to code in the first place.


  • I mean yeah, I agree, but is it that hard to keep relevant technology in the mix? I'm not saying everything has to be cutting edge!

    • Sure, but are C++ or Java really that outdated? AFAIK that’s what most schools teach, maybe with some JavaScript as well. It’s not like they’re teaching Fortran or COBOL.

      And with the advent of AI coding, I’d hope they can spend more time on system design, as that’s where I’ve found new grads are generally lacking.


    • Many professors view teaching as a secondary obligation. Even if they don't, it takes more time to learn to teach something than just to learn it. Our field is moving so fast that, outside of the major innovations, it would be quite difficult to keep up being a good teacher on everything while also doing research and doing the actual teaching. In addition, most new tech isn't very interesting or useful. Every couple of months I get another peek at SOTA Python or JS, and the "innovation" is just another layer of duct tape that doesn't really improve much.

      Cool tech usually also sees faster adoption in academia. Rust courses were offered at the uni I went to back in 2017, for example. According to my friends still involved with uni, there has also been a strong shift towards more data science/engineering and HCD since then, both fields that saw major practical improvements.

When I was in CS, we were taught theory. If you wanted to be caught up with the current tech, you'd teach yourself.

  • That was my experience in the 80's - we were taught theory, we had to apply the theory in projects so we spent lots of time programming and getting stuff working - but we were pretty much expected to pick up particular languages, operating systems or libraries by ourselves.

    The CS theory (i.e. maths-based) side of it really has stuck with me - the only other thing being vi controls hardwired into my brain, even though I went on to become more of an emacs fan...

Which is a good thing. They should be teaching the cornerstone principles, not offering vocational courses.

  • I think having one or two "software engineering" courses where it's project-based really helps. You get to actually learn how to use Git, work in a team, and architect and finish a project on time - which is going to be valuable no matter if you're seeking a software engineering job afterwards or stay in academia.

    • Seriously, I had multiple courses on Java and C++ but not a single mention of version control. It wasn't even an elective, as far as I remember.

      A course where 3+ students build something together in a single repo and the professor can view the commit and PR history would be amazing.


  • My old CS prof at my uni used to say, when this question came up, "do you sign up for an astronomy course and expect them to teach you how to build a telescope?"

    It's always puzzled me why people sign up for an academic education that has 'science' literally in the name and then complain when they get a theoretical education. It's not a tool workshop.

    • Well, the issue is that the majority of people study CS to become software engineers, not academics in CS. There are only a small number of software engineering degrees at select universities, so CS is the de facto route to becoming a SWE. So it’s not unreasonable that students would want a bit of practical industry education in their CS degree.

      I’m actually surprised that, with as much money as there is in tech, there hasn’t been more influence towards shaping curricula to be more industry-relevant. Companies waste tons of money ramping up new grads and bridging the CS-to-SWE gap; surely the incentives are there for a different curriculum.

    • A single course... no, but if I was majoring in astronomy I would expect to understand telescopes enough to put one together.

  • You think most people spend tens of thousands of dollars on college and expect not to be employable?

    • What you're after is a technical college with vocational training. They are definitely important and have their place. Some students at university would be better off going to technical colleges. But technical colleges are not the same as what universities do.


    • Exactly. Why are people so repulsed by the idea that a college degree may give you some kind of employable skill? Like, “Ew, gross, they taught git in CS 101, how dare they degrade the purity of our scientific education.”

The best CS programs teach a lot of tech that is not used in the business world. If anything, they're often too theoretical or too experimental.

This is CMU, so they would be at the bleeding edge, just like MIT/Stanford. But I think all the schools are behind today.