Comment by deactivatedexp

6 days ago

I'm a dropout, I just do horrible coding for fun. Can anyone explain why CS programs are adamant about not teaching on-the-job skills? Like databases, web, embedded, security, plus theory

Most Computer Science programs teach principles.

The ideas behind a database are more important than the concrete database used. If the ideas are taught, then the students can adapt when the fashion changes.

So instead of teaching something that you can use on your first job, they teach principles you can use your entire career. And times do change.

Remember the good old languages: Fortran, Algol 60, etc. That's where the jobs were...

I graduated with a degree in CS. Checking back with my university they'd added a degree in Software Engineering.

CS (traditionally) is more about algorithms, limits, growth. There's a bunch of complexity in even just understanding and calculating what a computer can do.

The Software Engineering degree makes so much sense if you're studying how libraries can be tied together and upgraded, unit/integration testing, container and source code management, etc.

They are truly different disciplines at this point.

  • Sorry if this sounds dumb, I don't have any formal education. Why is there a distinction?

    When I hear "computer science" I hear the study of computers, which in my head means software, hardware, and theory. It's like having doctors that study humans theoretically and others that do surgery.

    • Computer science is step one towards getting a PhD and doing research and design of actual tools like compilers: understanding how the CPU sees and asks for data stored in memory, efficient algorithms to do XYZ. It's highly relevant for areas like embedded computing, where you're often working with severe hardware constraints and possibly with a real-time OS.

      Software Engineering is the study of how to efficiently build software, typically on a medium to large team.

    • Computer Science as a field predates computers. Computer Science is the subsection of maths that answers "what is computable".

      Dijkstra's line sums it up well: "Computer science is no more about computers than astronomy is about telescopes."

    • Analysis is the key. You write a program to multiply matrices. That is software engineering. You analyze the program to figure out how much slower it becomes as the matrices become larger, how many times it must access DDR, how well-utilized the caches are, etc. That is computer science.

    • Computer science is designing a new video compression algorithm to be more efficient. Software engineering is taking that video compression algorithm and putting it into an application users actually use.

    • I think the analogy

      Software Engineers : Doctors :: Computer Scientists : Biologists/Biomedical Scientists

      is perhaps reasonable.
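The matrix-multiplication example a few replies up is easy to make concrete. A minimal sketch in Python (the textbook triple-loop algorithm, with an operation counter standing in for the real measurements a profiler would give you, such as wall time, DDR traffic, and cache misses):

```python
# Naive triple-loop matrix multiplication with an operation counter.
# Writing this is the "software engineering" half; reasoning about how
# the count grows with n is the "computer science" half.

def matmul_count(a, b):
    n = len(a)
    c = [[0] * n for _ in range(n)]
    mults = 0
    for i in range(n):
        for j in range(n):
            for k in range(n):
                c[i][j] += a[i][k] * b[k][j]
                mults += 1
    return c, mults

for n in (2, 4, 8):
    a = [[1] * n for _ in range(n)]
    _, mults = matmul_count(a, a)
    print(n, mults)  # multiplications grow as n**3: 8, 64, 512
```

Doubling the matrix size multiplies the work by eight, which is the kind of conclusion that holds regardless of language, library, or decade.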

That heavily depends on the school. But I’ll add many years later, I appreciate the theoretical classes much more because the knowledge I gained there has aged much better.

In our pretty dynamic, fast-moving industry, it is more important to learn how to learn technologies than to learn the technologies directly. No one wants to be a specialist in 20-year-old outdated tech late in their career. Having a strong foundation in how CS tech generally works allows you to move between different tech stacks fairly quickly.

Pretty much every class that focused on "on the job skills" when I went to university (2008-2014), either at my university or in coursework available elsewhere (I might have considered jumping universities for non-educational reasons), was either outdated or useless within a few years. If I were to redo it I'd prefer to have even more "esoteric" tech involved, and more theory and fundamentals.

And not because they were outdated and useless when the course started, though some did have issues. I think the least outdated "on the job applicable" material is Java fundamentals, and that's only because you can still write Java 6-style basic stuff on the current JDK; a lot of the interesting and powerful stuff we learnt is no longer available in the JDK, though.

OTOH, the fundamental principles (algorithms, including the set logic in RDBMSes; low-level programming, which, to the annoyance of some, was done in SPARC assembly or a random assembly designed just for a given assignment; robotics using a programming stack and parts never seen outside the university; various in-depth studies of different theoretical or scientific areas) are all material I use to this day in many different jobs, because they let me understand and reason in ways I couldn't before I joined university. And I already "knew" how to program then.
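The "set logic in RDBMSes" point is a good example of a principle outliving any one product: SQL's WHERE, JOIN, and SELECT clauses are selection, join, and projection from relational algebra, which you can express directly over sets of tuples. A toy illustration in Python (made-up rows; plain sets standing in for tables):

```python
# Relations as sets of tuples: (id, name) for employees, (id, dept) for assignments.
employees = {(1, "ada"), (2, "grace"), (3, "edsger")}
departments = {(1, "eng"), (2, "eng"), (3, "research")}

# Selection (SQL: WHERE dept = 'eng') is filtering a set.
eng = {(i, d) for (i, d) in departments if d == "eng"}

# Natural join (SQL: JOIN ... ON matching ids) pairs tuples on the shared key.
joined = {(i, name, dept)
          for (i, name) in employees
          for (j, dept) in departments
          if i == j}

# Projection (SQL: SELECT name) drops columns.
names = {name for (_, name, _) in joined}
print(sorted(names))  # ['ada', 'edsger', 'grace']
```

Once you see queries this way, moving between Oracle, Postgres, or whatever replaces them is mostly a syntax exercise.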

BTW, within the 5 years I spent at university, "JavaScript on the server" turned from a niche use case within some stacks, usually running Mozilla Rhino, into a complete new ecosystem that was surviving its first big fork and becoming used in big projects. Android went from one phone and "who knows" to a pretty mature platform holding half the world, with 64-bit CPUs, tablets, and the first wearables; iOS evolved similarly. In fact, arguably over my entire university time we went from "you can now make mobile apps easier", through "mobile app developer is lucrative", all the way to "you're probably not going to make much money as a solo developer anymore".

It's stunning to work with fresh-out-of-college CS grads and... have to take 10 minutes to explain how git works and why we use source control, and then every day field questions about git, merges, fixing merge conflicts, etc. as they get up to speed. I realize there is a learning curve for someone fresh out of school, but there should also be some basic job training using free tools like git and SQL. The average student is only going to get a shadow of the relevance if they're learning pure theory with no context.

  • These are tools! You don't go to university to learn how to use git. Besides, tools come and go. You can pick these up along the way. You can make their use a part of coursework, but this is not what you go to university to study!

    That's the point. CS curricula are supposed to teach you deep skills and principles, not how to fiddle around with git.

    • If your CS degree never requires you to write a program complex enough to warrant version control, you should get your money back. I agree that the theory is more important and that tools should not be the focus, but exercising the theory you have learnt is extremely rewarding.
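For what it's worth, the ten-minute "how git works" explanation is itself mostly theory: commits are content-addressed snapshots linked by parent pointers, and a merge starts by finding a common ancestor in that DAG. A deliberately simplified sketch in Python (this shows the idea only, not git's actual object format):

```python
import hashlib

# Toy commit store: a commit's ID is a hash of its content plus its
# parents, so identical history always yields identical IDs.
commits = {}

def commit(snapshot, parents=()):
    payload = repr((sorted(snapshot.items()), tuple(parents))).encode()
    cid = hashlib.sha1(payload).hexdigest()
    commits[cid] = {"snapshot": snapshot, "parents": tuple(parents)}
    return cid

def ancestors(cid):
    # Walk parent pointers: the commit graph is a DAG, like git's.
    seen, stack = set(), [cid]
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(commits[c]["parents"])
    return seen

# Two branches diverging from a shared root commit.
root = commit({"file.txt": "v1"})
feature = commit({"file.txt": "v2"}, parents=[root])
main = commit({"file.txt": "v1", "other.txt": "x"}, parents=[root])

# A merge compares each branch against the shared ancestor to decide
# which side changed what; here the merge base is the root commit.
merge_base = ancestors(feature) & ancestors(main)
print(root in merge_base)  # True
```

In real git the snapshots are tree and blob objects and the hash covers a serialized object, but the DAG-of-hashes structure is the same. That structure is why merges need a base and why rewriting history changes every descendant's ID, and a student who has seen the principle picks up the tool's commands in an afternoon.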