Comment by tptacek

6 days ago

Teaching object-oriented design as fundamental computer science seems like it would have been an odd decision for 2025. I looked up the class syllabus, and it seems to have been taught in Java (fine) and makes extensive reference to design patterns (not fine).

I once asked a professor about this and the answer I got was interesting.

Basically Java and OO and design patterns are taught up front because it turned out this was a huge stumbling block for a lot of people, bigger than, for instance, C and pointers. It just doesn't click for a lot of people, and they end up struggling for the rest of the major.

So it's not that these are the most crucial concepts; it's that you want people to "fail fast" and get a sense, within the first year, of whether they'll succeed in the major.

  • Right, so... just don't teach Java and OO at all? They're not fundamental. Some of the ideas in that course are something closer to discredited.

    Later

    I edited "a lot of" to "some of"; I was coming on too strong.

    • If by "discredited" you mean incorporated into many of the most-used languages, as well as several of the hot new ones, then yes.

      Furthermore, part of PLT is learning from the past: what worked and what didn't.

      1 reply →

  • This makes sense. You might not immediately need to design a complex hierarchy yourself, but you'll encounter one pretty early in a lot of languages' standard libraries.

  • The first time I tried to learn how to program, I was 12-13 years old and found some video from a university that started by covering Classes, Methods, and Objects.

    I watched one or two lectures and it made no sense to me, so I gave up. I had no idea WTF "objects are like nouns and methods are like verbs" was trying to teach me. I just wanted to make my computer do things.

    Around 14-15 I started playing around with my TI-84 calculator, writing simple programs. The TI-84 used a form of BASIC where I could write a program that took INPUT, plugged it into an equation, and printed the result as OUTPUT, and it felt so much more approachable than the neo-neo-platonist OO lectures I'd watched. From there I gradually wrote more complicated code until I eventually started to get why programmers would define functions to stop repeating themselves, or why they might implement custom types.

    > So it's not that these are the most crucial concepts, but you want people to "fail fast" and have a sense if they'll succeed in the major within the first year

    I'd instead posit that so many people "fail fast" with OO because they go into the class interested in programming, but end up with no idea wtf is even going on, because they're forced to learn all this inane trivia [0] and write all this boilerplate (define a class, methods, visibility, return by value vs. ref) that they don't understand before they can even run a program. They conclude that maybe they're stupid or not a good fit for programming, and drop it.

    IMO a better teaching language would be one that lets you opt-in to OO and functional features but also lets you write really simple programs like "take a number, multiply it, and print it". I think that's why Python is so popular these days. It helps that the lack of semicolons/curlies, optional typing, and modifiers [1] removes so many distractions and gotchas that stymie absolute novices.

    I also think most CS educators do a very poor job of explaining CS concepts to beginners without realizing it. "Methods are like verbs" is absolute nonsense without a moderate-to-large amount of computer science knowledge to contextualize it. Some of my teachers were actually pretty good, but even then I don't remember much acknowledgement that programming didn't have to be this way, and that the language/tool was designed that way because the abstraction comes in handy. That would probably help a lot in retaining students who suffered through their first semester of CS 101 in Java but hated it so much they decided to swear off programming.

    [0] Always start your program with "public static void main(String[] args)"! Don't worry, that'll make sense in a year, or in six years when one day, on the toilet at your software engineering job, you realize that it really was a static method returning void that took a String[] called args.

    [1] Static and Foo& are justifiable, although static should arguably be implicit for a beginner language. Forcing students to learn about final, const, val/var, public/private, etc. early on is just stupid. I never understood why these were actually useful, or had a good reason to use them, until I'd already graduated.
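
    For concreteness, here's roughly what "take a number, multiply it, and print it" costs in Java, with the ceremony both footnotes complain about (a sketch; the class name is made up):

        import java.util.Scanner;

        public class Doubler {                        // must define a class first
            public static void main(String[] args) {  // the incantation from [0]
                Scanner in = new Scanner(System.in);
                double x = in.nextDouble();           // take a number
                System.out.println(x * 2);            // multiply it and print it
            }
        }

    In Python, the whole program is two lines; everything else here is boilerplate a novice has to type on faith.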

    • Brilliant. And correct. The people who created these concepts landed there after going down paths similar to yours (i.e. they started somewhere straightforward but eventually invented these complexities out of necessity). They understand the "why" behind them.

      But somehow, in the curriculum, we expect complete noobs to just get these abstract, non-relatable concepts without any context.

There's some irony here. These courses exist because years ago, CS departments made a shift towards "marketability" in their courses at the expense of "fundamentals". So they introduced Java to replace C++ because that's the language companies hiring new grads were looking for. And now OOP and Java are not skills that companies are looking for in new grads. But instead of acknowledging the facts, the departments are lumping Java+OOP in with "fundamentals" and getting rid of it.

I'm sure if we look back further, C++ displaced some other language (Pascal?) for exactly the same reason, and likely the same happened with whatever language preceded C++. I'm just not old enough to have personal experience here.

OOP is one of the major schools of programming, significantly more widespread than functional languages. The only thing arguably used more is "simple" procedural, but even that is doubtful. Sure, in the 90s people thought OOP was the be-all and end-all, and we've moved away from that, but it still makes complete sense to teach the main styles of programming. C and assembly are basically covered by courses on computer architecture and OSs, so teaching an intro course on both FP and OOP gives students a broader understanding of how the vast majority of program design works. Follow-up courses on algorithms can then build on that by showing how much easier/safer/faster algorithms are to implement depending on the style.

  • I reject the idea that OO is a "major school of programming"; or, if it is, that school has been largely discredited. I think you're on firmer ground if you claim that ideas from OO still inform modern programming (traits and interfaces being a good example).

    I think a lot of 2025 developers would be alarmed to think that a project had started from an object-oriented design perspective.

    • That's a bubble thing. The vast majority of serious software engineering is done in OOP. Java, C# and C++ alone are more than half the market, and then you have Python, Ruby, Kotlin and many more. Even JS has moved largely to (bastard) classes.

      Then you have data work (growing above average), scripting, and parts of frontend that are done differently, but those are still a minority of the job market.

      1 reply →

    • It's still useful sometimes. Right now I have a functional-style program that I think would be easier to write if it were redesigned in mixin-style OOP (a sketch of what I mean follows below). It usually happens the other way around, but not always.

      1 reply →
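
      Here's the kind of thing I mean by mixin-style, sketched with Java default methods (all names hypothetical):

          // A mixin: any class that can report a position picks up
          // movement-describing behavior just by implementing this.
          interface Movable {
              double x();
              double y();

              default String describeMove(double dx, double dy) {
                  return "move to (" + (x() + dx) + ", " + (y() + dy) + ")";
              }
          }

          class Particle implements Movable {
              private final double px, py;
              Particle(double px, double py) { this.px = px; this.py = py; }
              public double x() { return px; }
              public double y() { return py; }
          }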

What's the problem with design patterns? At its core, a design pattern is just a name applied to a frequently used code pattern. It gives us a common language. I can tell you we are using a publisher-subscriber model and you know what that is without me having to be super specific.
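
To make the shared-vocabulary point concrete, here's a minimal publisher-subscriber sketch in Java (illustrative names, not any particular library's API):

    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.Consumer;

    class Publisher<T> {
        private final List<Consumer<T>> subscribers = new ArrayList<>();

        void subscribe(Consumer<T> subscriber) {
            subscribers.add(subscriber);
        }

        // Every registered subscriber is notified of each published event.
        void publish(T event) {
            subscribers.forEach(s -> s.accept(event));
        }
    }

Saying "publisher-subscriber" conveys all of that structure, and the decoupling it implies, in two words.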

  • The concept of a design pattern is ~fine, and people have done good work applying the concept to specific programming domains (concurrency being the best example I can come up with), but "Gang-of-Four"-style design patterns are just workarounds to the limitations of the languages those authors were grappling with at the time, and are deeply silly in modern code, mostly living on today as an AbstractFactoryFactory joke.
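
    Strategy is the canonical example: in the GoF book's languages it takes an interface (or abstract class) plus a concrete class per behavior, while any language with first-class functions collapses the whole thing into a lambda. A sketch with hypothetical names:

        // GoF-era Strategy: one interface, one class per behavior.
        interface DiscountStrategy {
            double apply(double price);
        }

        class HolidayDiscount implements DiscountStrategy {
            public double apply(double price) { return price * 0.8; }
        }

        // Post-Java-8, the same "pattern" is just a lambda:
        // DiscountStrategy holiday = price -> price * 0.8;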

If you are going to teach OO in Java, I fail to see why teaching design patterns is so awful.

  • I mostly point it out as a reason why teaching OO as a fundamental is a bad idea in the first place.

    • But OO is a fundamental, even if it is no longer gospel.

      OO is important, and if in three years of computer science there's no time to teach it, then you have to ask what the heck they are doing.

      OO is everywhere, it's an important software concept, and it's the right solution for certain classes of problems.

      12 replies →