
Comment by masfoobar

4 days ago

> The industry and the academy have used the term “object-oriented” to mean so many different things. One thing that makes conversations around OOP so unproductive is the lack of consensus on what OOP is.

There have been different terms and meanings attached to it over the years, but we all know the "OOP" thrown about since the mid-to-late 90s is the Java way.

A typical Java book back then would run to 900 pages, half of which was spent explaining OOP. While not fully specific to Java, that did help transfer the knowledge over to Delphi or C++ or, eventually, C#, etc.

Overall -- we all knew what "Must have good OOP skills" meant on a job advert! Nobody was confused, thinking "Oh.. I wonder which OOP they mean?"

I have a love/hate relationship with OOP. If I have to use a language that is OOP by default, I use it in moderation. While the built-in classes will have their own inheritance, I tend to follow a basic rule of going no deeper than 2 levels, and most of the time the parent is an interface. I prefer composition over inheritance.
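A minimal sketch of what that looks like in practice -- Java here rather than the commenter's C#, though the shape is the same; the names (`Logger`, `OrderService`) are made up for illustration:

```java
import java.util.List;

// Hypothetical example: a Logger reused by composition rather than inheritance.
interface Logger {
    void log(String message);
}

class ConsoleLogger implements Logger {
    public void log(String message) {
        System.out.println("[log] " + message);
    }
}

// OrderService *has a* Logger instead of extending one -- the dependency
// comes in through the constructor and stays swappable, and the only
// "inheritance" in sight is one level: implementing an interface.
class OrderService {
    private final Logger logger;

    OrderService(Logger logger) {
        this.logger = logger;
    }

    void placeOrder(String item) {
        logger.log("order placed: " + item);
    }
}

class Main {
    public static void main(String[] args) {
        new OrderService(new ConsoleLogger()).placeOrder("book");
    }
}
```

Swapping `ConsoleLogger` for a file or null logger needs no change to `OrderService`, which is the usual payoff of composing rather than subclassing.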

In C#, I use static classes a fair bit. In that case, classes are helpful for organising my methods. However, I could do this at the namespace level if the language just let me create simple functions not wrapped inside a class.
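The Java equivalent of a C# static class is a final class with a private constructor -- a sketch, with a made-up `TextUtils`/`slug` helper as the example:

```java
// Hypothetical helper grouping: the class exists only to namespace the methods,
// the way a C# static class (or a plain module of functions) would.
final class TextUtils {
    private TextUtils() {} // no instances -- it's just a bucket for functions

    // Turn a title into a URL-friendly slug.
    static String slug(String title) {
        return title.trim().toLowerCase().replaceAll("[^a-z0-9]+", "-");
    }
}
```

Callers write `TextUtils.slug("Hello World")`; the class name is doing nothing but the namespacing the comment describes.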

OOP has its place. I prefer to break down my work with interfaces. Being able to dispatch to the correct implementation is better than if/switch statements all over the place. However, this can be achieved in non-OOP languages as well.
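The interface-over-switch point can be sketched like so (Java again; the `Exporter` interface and the two formats are invented for the example):

```java
import java.util.Map;

// Hypothetical exporters: pick an implementation by key once,
// instead of repeating if/switch branches at every call site.
interface Exporter {
    String export(String data);
}

class Exporters {
    // One lookup table replaces scattered switch statements; adding a
    // format means adding an entry, not editing every branch.
    static final Map<String, Exporter> BY_FORMAT = Map.of(
        "csv",  data -> data.replace(' ', ','),
        "json", data -> "{\"data\": \"" + data + "\"}"
    );

    public static void main(String[] args) {
        System.out.println(BY_FORMAT.get("csv").export("a b c"));
    }
}
```

The same dispatch-through-a-table idea works in non-OOP languages too, e.g. a map of function pointers in C, which is the commenter's closing point.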

I guess my point is that OOP was pushed heavily back in the day. It was "shut up and follow the crowd". It still has its place in certain scenarios, like GUI frameworks.