Comment by t_mann

3 years ago

Some really good points on the ultra-fast depreciation of SE tech skills. A relative of mine is a mechanical engineer, well past retirement age and still going strong in his 2-man consulting shop because that's what he loves doing. He works with precision manufacturers, automotive suppliers,... all very cutting-edge stuff, helping them develop new product lines, manufacturing processes,... He says the core skills that he's using are still those that he learnt in university several decades ago.

I was actually surprised because I heard so much about crazy new materials like carbon fibers, graphene, technologies like 3D-printing,... but apparently what makes a machine break are still the same things: mechanical stress, heat dissipation, friction,... new materials and processes might change the coefficients, but not the (mostly Newtonian) physics (my interpretation btw, not his words).

One could say the same thing about software engineering - true fundamental advances in algorithms and data structures are sufficiently rare that keeping up with them wouldn't be a nuisance. But the share of day-to-day work those fundamentals cover, relative to the extremely fast-changing landscape of tools and frameworks, is much smaller (plus, one could argue that even the fundamentals see a lot of shifting ground in CS, with neural architectures, differentiable programming, not to mention quantum computing).

I think the skill depreciation concept is a bit exaggerated, especially given how new computing is relative to engineering in general, which has arguably been around for thousands of years.

For example, SQL & Unix have been around since the 1970s.

Linux since late 1991.

JavaScript: the end of 1995. Node.js: 2009.

Sure, there's a ton of churn in the JS Ecosystem, but all it takes is a bit of wisdom, skepticism, and patience to avoid the hype-cycle.

Also, once you learn to build certain things with a programming language you learn the paradigms of the system you build.

For example: web servers. Looking at the ExpressJS and Python Flask documentation, there are many analogous pieces, because both follow the same protocol standards.
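The shared paradigm is easy to make concrete. A stdlib-only sketch of the core idea both frameworks implement - mapping an HTTP method and path to a handler function that returns a body and a status code (the names here are illustrative, not either framework's real API):

```python
# Minimal routing core common to Express and Flask alike:
# register handlers per (method, path), dispatch requests to them.
routes = {}

def route(method, path):
    """Register a handler, like Flask's @app.route or Express's app.get."""
    def decorator(handler):
        routes[(method, path)] = handler
        return handler
    return decorator

@route("GET", "/users")
def list_users():
    # Both frameworks ultimately return a body plus an HTTP status code.
    return {"users": ["ada", "grace"]}, 200

def dispatch(method, path):
    handler = routes.get((method, path))
    if handler is None:
        return {"error": "not found"}, 404  # same status codes everywhere
    return handler()

body, status = dispatch("GET", "/users")
```

Once you have internalized this routing model in one framework, reading the other's docs is mostly a matter of mapping syntax.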

Another example: data engineering and statistical computing. Comparing R and Python packages, you find largely the same concepts, just in a slightly different syntax and language.
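For instance, the "group by, then summarize" pattern shows up as `group_by() %>% summarise()` in R's dplyr and as `groupby().agg()` in pandas. A stdlib-only sketch of the concept itself, with made-up sample data:

```python
# Group rows by a key column and compute the mean of a value column -
# the same concept dplyr and pandas each wrap in their own syntax.
from collections import defaultdict

rows = [
    {"lang": "R", "stars": 3},
    {"lang": "Python", "stars": 5},
    {"lang": "Python", "stars": 4},
]

def mean_by(rows, key, value):
    groups = defaultdict(list)
    for row in rows:
        groups[row[key]].append(row[value])
    return {k: sum(v) / len(v) for k, v in groups.items()}

result = mean_by(rows, "lang", "stars")  # {'R': 3.0, 'Python': 4.5}
```

Learn the concept once and the per-language API is mostly a lookup exercise.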

HTTP/1.0: 1996 (RFC 1945)

TCP: 1974 "In May 1974, Vint Cerf and Bob Kahn described an internetworking protocol for sharing resources using packet switching among network nodes."

"TLS is a proposed Internet Engineering Task Force (IETF) standard, first defined in 1999"

Considering all of this... I don't think most major things actually change that much. Sometimes a popular new framework takes the world by storm, but that's pretty rare compared to the output and churn of the ecosystem.

  • The issue is that every job I’ve had requires learning a bunch of new shit. Rarely am I just transferring over to the same languages or frameworks. Add on that each company picks different design patterns to use and has a different interpretation of what REST is and what HTTP status codes mean… it’s a pain in the ass to become an expert within any reasonable amount of time. Expert meaning someone who can dive into the true weeds, like cryptic memory leaks that require special profiling tools that aren’t documented anywhere, etc., and do so at a moment’s notice with ease.

    Especially if you’re a full stack eng who is constantly swimming over the entire stack and they keep pushing new DBs, new logging tools, etc.

    There are commonalities but it is a lot of learning as you go. I used to know Angular pretty well but now I don’t remember it at all. I haven’t even gotten to really ramp on React as much because my company uses it in such a terrible way that it’s clearly not fit for.

    • I stopped being full stack for this reason. It's too much effort to keep up with the entire stack and companies don't compensate great full stack devs more than great front end or back end devs. I think it's just a marketing ploy to get unassuming youngsters to spend more time at work being full stack so they can try to pay one person to do two jobs.

  • From the perspective of whoever is experiencing it, it doesn't really matter how it fits into the historical picture. What you experience is sitting in front of your screen while your family is having dinner, not points on a multi-decade timeline.

In the last 20 years I have seen mostly ultra-fast depreciation of SE _interviewing_ skills.

After the first ten years software development becomes quite intuitive and you internalize all those best practices. You can be trusted to start a new service from an empty git repository. Later it gets incremental, there's a lot of path dependency in languages and frameworks and few things come out of the blue. Those that do are frequently intellectually stimulating to learn.

But interviews have been steadily getting stranger and more difficult (in a way unrelated to real-life software development), at least over the last 10 years.

  • Very true. This stuff started at places like Google and Facebook, and it makes a sort of sense for them: right or wrong, they're very focused on hiring new college grads. With no real work experience to speak of, you can do a lot worse than hiring the ones who show they can apply their CS coursework to leetcode problems.

    But doing the same to workers with 10 years of real world experience doesn't make nearly as much sense. Like hiring medical doctors by quizzing them on organic chemistry problems. Google and Facebook do it because they can and they don't know what else to do, but I don't understand how it became a universal practice.

    • Yeah. I agree. I have "Cracking the Coding Interview" on my bookshelf. There's a lot of very good stuff in it. I enjoy thumbing through it. But I keep waving it at my boss saying "I will _not_ do this to people with several years of experience ~ This book is _not_ going to be my blueprint for interviewing"

>I was actually surprised because I heard so much about crazy new materials like carbon fibers, graphene, technologies like 3D-printing,... but apparently what makes a machine break are still the same things: mechanical stress, heat dissipation, friction,... new materials and processes might change the coefficients, but not the (mostly Newtonian) physics (my interpretation btw, not his words).

I don't really see the difference between that and programming. Writing code is still a bunch of "if" statements, the same underlying data structures, the same algorithms, etc. There's some new technology layered on top, akin to carbon fiber and the other things you mentioned, but it's fundamentally the same.

  • You really think that a SE in their 70s who learnt how to write if statements and data structures 50 years ago would say that they're still basically doing the same thing now as back then? Maybe if they work on legacy stacks, like the famed COBOL devs coming back out of retirement. But the thing is that what he's working on is cutting edge, not maintaining systems that were built decades ago.