Comment by hellohowareu

3 years ago

I think the skill depreciation concept is a bit exaggerated, especially considering how new computers are relative to the engineering field in general, which has arguably been around for thousands of years.

For example, SQL & Unix have been around since the 1970s.

Linux since late 1991.

Javascript: The end of 1995. NodeJS: 2009.

Sure, there's a ton of churn in the JS Ecosystem, but all it takes is a bit of wisdom, skepticism, and patience to avoid the hype-cycle.

Also, once you learn to build certain things with a programming language, you learn the paradigms of the kind of system you're building, and those paradigms transfer.

For example, web servers: looking at the ExpressJS vs Python Flask documentation, there are many analogous pieces, because both follow the same protocol standards.
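To make the analogy concrete, here's a minimal sketch (not taken from either project's docs; the /health route and handler name are invented for illustration), with the roughly equivalent Express line shown in a comment:

```python
# Minimal Flask route handler (hypothetical /health endpoint, for illustration only).
# The ExpressJS analog is nearly line-for-line the same:
#   app.get("/health", (req, res) => res.status(200).json({ status: "ok" }));
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/health", methods=["GET"])
def health():
    # Both frameworks hand you a request, let you pick an HTTP status code,
    # and serialize a response body -- the underlying protocol is the same.
    return jsonify({"status": "ok"}), 200

if __name__ == "__main__":
    app.run(port=5000)
```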

Another example, data engineering / statistical computing: comparing R vs Python packages, you find a lot of the same concepts, just in slightly different syntax and languages.
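Again, just a sketch with made-up data and column names: the group-and-aggregate idea looks almost the same in pandas as it does in R's dplyr (group_by(region) %>% summarise(total = sum(sales))).

```python
# Hypothetical group-by / aggregate example in pandas; the dplyr version in R
# expresses the same concept with different syntax.
import pandas as pd

df = pd.DataFrame({
    "region": ["north", "north", "south"],
    "sales":  [100, 150, 80],
})

# Same idea as dplyr's group_by(region) %>% summarise(total = sum(sales)).
totals = (
    df.groupby("region", as_index=False)["sales"]
      .sum()
      .rename(columns={"sales": "total"})
)
print(totals)
```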

HTTP/1.0: 1996 (RFC 1945)

TCP: 1974 "In May 1974, Vint Cerf and Bob Kahn described an internetworking protocol for sharing resources using packet switching among network nodes."

"TLS is a proposed Internet Engineering Task Force (IETF) standard, first defined in 1999"

Considering all of this... I don't think most major things actually change that much. Sometimes a popular new framework takes the world by storm, but that's pretty rare compared to the output and churn of the ecosystem.

The issue is that every job I’ve had requires learning a bunch of new shit. Rarely am I just carrying over the same languages or frameworks. Add to that that each company picks different design patterns it wants to use and has its own interpretation of what REST is and what HTTP status codes mean… it’s a pain in the ass to become an expert in any reasonable amount of time. Expert meaning someone who can dive into the true weeds, like cryptic memory leaks that require special profiling tools that aren’t documented anywhere, etc., and who can do that at a moment’s notice with ease.

Especially if you’re a full stack eng who is constantly swimming across the entire stack while the company keeps pushing new DBs, new logging tools, etc.

There are commonalities, but it’s a lot of learning as you go. I used to know Angular pretty well but now I don’t remember it at all. I haven’t even gotten to really ramp up on React, because my company uses it in such a terrible way, for things it’s clearly not fit for.

  • I stopped being full stack for this reason. It's too much effort to keep up with the entire stack, and companies don't compensate great full stack devs more than great front end or back end devs. I think it's just a marketing ploy to get unassuming youngsters to spend more time at work being full stack, so they can try to pay one person to do two jobs.

From the perspective of whoever is experiencing it, it doesn't really matter how it fits into the historical picture. What you experience is sitting in front of your screen while your family is having dinner, not points on a multi-decade timeline.