Comment by goldenshale
3 years ago
I think this article hits on some truths and gets a handful of things wrong. First, they are correct that we are in a dynamic profession that requires constant learning and expanding. No doubt the people who choose to stay in software are likely to be curious, life-long learners for whom this is a benefit of the profession rather than a drawback. That said, one thing I noticed from teaching computer science as a graduate student is that the poor students think about languages and libraries as the key skills, while the better students think about data structures, algorithms, and design principles. These slightly more meta-level concepts are far more stable and timeless, while the underlying implementations of them are constantly evolving. If you think of each new library as "having to learn a new skill," I can imagine burnout and overload are more likely; but if you figure that once you know 3D graphics you can pretty easily pick up any new graphics engine, then it might even seem fun to explore different takes on the same old concepts. It's like hearing a new band in your favorite genre.
As for attributing burnout as the core issue here, I would strongly disagree with this idea. When teaching undergrads I noticed immediately that a good portion of each student cohort was basically there because they were interested in making money in the future rather than exploring the ideas of computer science. They were no doubt going to get frustrated or bored and move into management or some other profession rather than continue to expand as an engineer. This is totally fine, and they are probably richer for having learned what they did, but I don't know why we can't just see this and appreciate it for what it is rather than portraying it as the drama of burnout.
It's hard to blame students when they look at job postings and all that they see advertised is positions for programmers using languages X, Y and Z, or when they see tweets and blog posts by hotshot programmers about frameworks A and B.
The entire industry focuses way too much on 'experience with tool X' as a proxy for 'technical skill Y'. It's a bit like asking a carpenter how many years of experience they have with DeWalt cordless nail guns rather than ask about their house framing skills.
Worse, industry is routinely bashing on CS degrees because they don't turn people into framework X-ready candidates. It's getting a little tiring just how little credit is given to the idea of "maybe these tools can be learned in a reasonable amount of time by people with a degree showing they can pick things up rather quickly".
Yes, but I interview for a role where we don't have set languages: we're a small optimisation team that works across systems when they hit capacity bottlenecks, and we do JS, C#, Java, C++, and PL/SQL fixes completely transparently. We've slowly learned that languages really don't matter all that much for performance tuning; what matters are the programmer mistakes that are identical in all of them (hash map loops, abusive recursion, over-abstraction leading to extreme indirection, serialization laziness producing giant message transfers, database transaction mismanagement, zero profiling skills, and the main one: please stop frigging saying it's the garbage collector and look at your crap for real lol).
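To illustrate one reading of the "hash map loops" mistake above, here is a hypothetical sketch (my own, not the commenter's code; names and data are made up) of the language-agnostic pattern: scanning a collection inside a loop instead of building a hash map index once.

```python
# Anti-pattern: a linear scan per lookup -> O(n * m) overall.
def find_orders_slow(orders, user_ids):
    return [next(o for o in orders if o["user"] == u) for u in user_ids]

# Fix: build the index once (O(n)), then do O(1)-average lookups.
def find_orders_fast(orders, user_ids):
    by_user = {o["user"]: o for o in orders}
    return [by_user[u] for u in user_ids]

orders = [{"user": "a", "total": 10}, {"user": "b", "total": 20}]
print(find_orders_fast(orders, ["b", "a"]))  # same results, far fewer comparisons
```

The same shape shows up in every language on the list; only the spelling of the hash map changes.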
When we recruit we ask, "if we tell you no more Spring, and that we'll work in any language as long as there's a problem to solve, how do you feel?" Most either say they'd feel horribly sad, or don't even comprehend how it's possible. Some are indifferent because they need the job. Still looking for the person who says "anything can be learned in a reasonable amount of time", so it's not that obvious I suppose :(
23 replies →
> industry is routinely bashing on CS degrees because they don't turn people into framework X-ready candidates.
To me that's noise and not much else.
Something I don't think a lot of folks realize is that there are two parallel industries (and pipelines to jobs in the industry). They almost never overlap.
One in which recruiting is largely done by non-technical folks who match keywords of frameworks, and where rote learning of whatever library is in fashion is seen as the objective (bootcamps come to mind).
The other one, where CS fundamentals are seen as the priority, and where hiring focuses on finding people who possess the skill of acquiring new knowledge.
You can guess in which one Google, FAANG (or whatever the acronym is now), and Stanford/MIT exist.
2 replies →
TBH CS degrees don't really produce people who can code in general either. As far as I can tell they take people who already understand the basics and teach them to formally communicate.
4 replies →
> It's a bit like asking a carpenter how many years of experience they have with DeWalt cordless nail guns rather than ask about their house framing skills.
This kind of logic only works for tech organizations that already have enough in-house domain expertise to onboard new programmers. The other day somebody asked me how to find a programmer to implement something for them. From a programming standpoint there was very little to do, but it involved many obscure technologies that you couldn't pick up in a day (and no, you can't pick different technologies). For a person who's already done something similar it'd be a quick and easy job, and shouldn't cost too much. With a generic programmer it'd take much longer, cost much more, and you couldn't be sure they'd actually deliver.
Finding someone "experienced with a specific tool set" is also not a trivial problem.
It is even harder if you don't even know how to verify that a person really is experienced with that specific tool set.
So it goes like "pick your poison": hire a generic dev and account for the learning curve, or spend money on recruiting fees/time to find an expert. Though I would not be so quick to say "expert shouldn't cost too much", because I can imagine an expert taking less time but costing orders of magnitude more than a generic dev.
>The entire industry focuses way too much on 'experience with tool X' as a proxy for 'technical skill Y'
Strongly emphasizing this. This is HIGHLY applicable to the analytics environment. As a business analyst who specialized mostly in ad-hoc development, because it was the most value-add area at the companies I worked with, I had a lot of trouble finding new work because I didn't use Tableau, or Power BI, or Looker, etc. I was seen as some sort of fool for doing everything in SQL, Excel, and Python.
IMO the tools are great, and you need a much shallower understanding of analytical concepts to get value from them. But for some reason people kept getting the impression I would somehow be less effective with them because I don't use them. And I had trouble correcting them given the limited bandwidth that exists to communicate with a single applicant in the hiring process. If I tried to get right to the point, I felt myself appearing arrogant.
The carpentry analogy is very similar to how I described it: "I am currently using a ruler and screwdriver, and these tools provide lasers and power drills."
Closer to hiring an electrician certified in Australia to work in the US as a carpenter.
Give them six+ months to train up and I’m sure they’ll do fine.
It's interesting because, at least among the working professionals I've encountered, the ones who focus on "meta-level" concepts are usually the ones who overthink every detail and get very little work done, and ultimately they are the ones who burn out.
They tend to bike-shed details, take way too long trying to create sophisticated abstractions that never quite achieve the silver bullet they originally thought they would, and spend too much time dwelling on irrelevant details that ultimately lead nowhere and result in a kind of paralysis that can be very hard to break out of.
The ones who master a specific language or master a specific library/technology that focuses on doing a few things very well and in very concrete terms are able to deliver the most business value. Furthermore they absolutely have the ability to take their mastery of that and map it to new languages or new libraries. I mean I'm not talking about going from Java to Haskell or Haskell to Idris, but rather people who master C++ can fairly easily pick up Java or Python or TypeScript. People who have mastered Unity can easily pick up Unreal Engine. People who have mastered web development can easily pick up mobile development.
The idea that people who have a solid mastery of a technology and a programming language are somehow just stuck, unable to take those skills and apply them to other areas, is I think overstated and just untrue. But those who treat software engineering as highly theoretical, and who focus on abstractions, design principles, and other high-level details, tend not to get much done; when they realize that software is not as clean and elegant as they would like it to be, they get burned out and give up.
I think going over any substantial codebase on GitHub for products that are widely used and deliver solid business value, where most code is not at all reflective of the ideals often espoused in blog posts, validates my point of view.
In short, people who treat software as just a tool to accomplish a concrete task are more productive than those who write software for the sake of writing software. They don't write the cleanest code, or the most elegant data structures and algorithms, but they produce the greatest amount of tangible business value.
If your comment does anything for me, it's to show how terribly few words we have to discuss these things.
> "meta-level" concepts
I'd say having a strong grasp of what you can achieve with just files and folders, or understanding how SQL solves an entire problem space, are meta-level concepts. It's just that we take them for granted.
> business value
Is apparently something different than 'value', but still includes every software ever that was valuable to a business?
> high level details
...?
> software engineering
Building a constraint solver for a compiler, or ensuring a JS animation centers a div?
> highly theoretical and focus on abstractions, design principles
I'd recognize all these things. But out of context 'in the general case' they become meaningless.
---
I understand the picture you are trying to paint, but I don't think it tells us anything beyond "I've noticed people make things overly complex". I agree.
However, keep in mind that the "gets things done and provides value" software you've seen is the software that survived; it might have been set up by a very experienced person (whose failures we're not seeing); nobody might recognize it as being not-simple (e.g. I've seen high-value business software partially recreate "regex": worked great, a straightforward and easy-to-read function of ~50 lines or so, but it could have been a single function call); and how the requirements are presented is hugely important.
I think no one was writing about the ones who master a specific language.
There are a lot of people who learn just the surface, without going deep into a tool, and think they know enough.
It seems to me that someone who really went deep into learning a language would pick up most of the theoretical stuff along the way, because there is no way to really master C++ or really master Java without learning about data structures and all kinds of "meta-level" concepts.
Maybe the difference is mostly the approach to learning: more practical vs. more theoretical.
I understand the concept here but there is also a level of right tool for the job.
Some guys see a screw and reach for their trusty hammer. Some guys know to grab a screwdriver.
I had a project the last two weeks where the code was just going to fail about as often as it was going to succeed. I had to write a resource manager and an Erlang style supervisor and use an embedded key value store.
A better dev may have intuited what took me basically a midstream rewrite to figure out; a worse developer may still be grinding on the problem.
I think my solve is "robust enough" but there was no real way to power through that. You either found the right abstractions or you didn't.
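For what it's worth, the Erlang-style supervisor idea mentioned above can be sketched in a few lines. This is a hypothetical minimal version of my own, assuming plain callables rather than real processes, and it only captures the restart-on-failure core:

```python
# Minimal "restart on failure" supervisor: run a task, restart it on a
# crash up to a limit, and escalate (re-raise) if the limit is exceeded.
def supervise(task, max_restarts=3):
    restarts = 0
    while True:
        try:
            return task()
        except Exception:
            restarts += 1
            if restarts > max_restarts:
                raise  # give up: escalate to whoever supervises us

# Example: a flaky task that fails twice before succeeding.
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(supervise(flaky))  # prints "ok" after two restarts
```

A real supervisor would also isolate child state and apply restart strategies, but the hard part, as the comment says, is finding abstractions like this at all mid-project.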
Not to be rude, but did you finish reading the article? The whole point was that high-aptitude learners give up. In fact I don't agree that re-learning the same tasks over and over with the zillionth framework iteration is a rewarding learning experience. It makes perfect sense to change careers instead.
And as far as music goes, you sound like the record companies that thought everyone should listen to disco for the next 50 years...
As I’ve gotten older it’s harder for me to learn an entirely new area (like say going from web dev to mobile or ML). But it’s actually easier to learn a new variation of something (like the latest JS framework) because it’s usually pretty similar to one of the things I already know. I guess this leads to increasing specialisation, but it also means studies that merely count “new skills” will be misleading if they don’t differentiate in which way the skills are new.
I'm the complete opposite. Hand me a new JS framework that does the same thing I've done a million times, but makes me learn its opinionated abstraction set that's somehow better, and I just turn off. I simply do not care, at all. You need to clearly explain the improvement you're proposing or it might as well be trash to me.
Now give me a new theoretical concept where I can expand my knowledge or integrate into my knowledge map and view of the world and I'm excited, there aren't enough hours in the day. Tell me about this all new concept I wasn't familiar with--I'll start thinking of ways I can use it, how I can leverage it, or how it may connect with other ideas and concepts I have.
Now give me a tight deadline which most business environments create and I agree with you, give me the boring stuff I can pump out, get my paycheck and go home to enjoy the rest of my day.
> No doubt the people who choose to stay in software are likely to be people who are curious, life-long learners
The article showed the opposite effect though. Curious, life-long learners stop working in software development because they have to constantly learn new skills and believe they can get more bang for their buck when they can invest in skills that don’t lose their value over time.
I once got excited about ExtJS, the way it created a desktop-like experience in the browser, and I said to myself, "I will learn this, all of it, I will become an expert. Tips and tricks, best practices, the works".
After six months of this, ExtJS 4 came out, which was essentially a totally new framework. Everything I learned was not only not applicable, it had to be actively unlearned.
The lesson here is: become good and proficient at something, but don't focus on becoming a ninja in one particular transient tech. There is value in becoming a Jedi of Unix built-in tools, or of more persistent technologies like Git, for example.
Also, this is a bigger problem in the JavaScript ecosystem, where the hype cycles are more intense than in, say, Python. I checked out my Flask project from seven years ago and it's ready to rock.
I get the thing about constant learning, but learning in this industry used to be cumulative. Now it's a hamster wheel. You are learning how to solve the same problems, in a different, presumably in a "new" way.
People seem to be spending more time coming up with catchy names for their projects than making sure this is all sustainable.
Yes, this is how I feel. I have no problem learning a new skill. I get discouraged when I learn a new skill and just when I start to get really comfortable and productive with it, it's suddenly "legacy" and some new thing is popular.
The only skills that have really stood the test of time for me are C, PHP, unix shell stuff, and SQL.
It's a mix of both. You need to have solid fundamentals and need to keep learning new ways to apply those fundamentals in the real world. There is absolutely effort involved in learning a new language, library, framework, platform no matter how good you otherwise are.
> I noticed from teaching computer science while a graduate student is that the poor (a) students think about languages and libraries as key skills, while the (b) better students think about the data structures, algorithms, and design principles.
The truth is that the programmers in group (b) think about both. Who's designing a lot of the new languages, libraries, and frameworks? Chances are it was someone from group (a). If you're in group (b), do you want to spend your whole career being forced by your bosses to constantly relearn and follow the latest vogue vision from group (a)? Of course not. So this might not apply to students, but everyone in group (b) will eventually get burned by fads enough times that they start caring about the politics of software. Namely, not wanting to depend on bloat that doesn't actually solve computer science and systems engineering problems. Group (b) might even create alternatives themselves. Go is a great example. The Bell Labs guys watched the vision behind their techniques decline over the decades and said, enough is enough. So they made Go, and it was like a ray of sunshine when it came out.
I actually find the model in the article pretty convincing, despite agreeing with you that it's a profession for people who like to learn new things, and that university/grad school should teach more generic, theoretical knowledge that depreciates more slowly.
However, these still don't invalidate the main point of the article: that a faster rate of depreciation means that your max knowledge level, given your specific rate of learning, will be lower. I.e., your advantage over a less skilled, younger professional will be smaller.
And you may say that learning a new 3D library shouldn't be counted as learning a new skill, but that doesn't make the problem go away. If anything, it underlines it: if you have to start working with a new 3D library, then you will have to spend time and effort learning it (to become efficient at using it), whereas if you were able to keep using the old one, you could spend that time and effort on learning something that we could count as a new skill.
The article is also hitting on the fact that your skill premium as an engineer has a cap, and so does your willingness to burn the midnight oil on a project. This means that as time goes on, as an engineer you'll face the following:
- A younger engineer will have the same value to your employer as you do.
- A younger engineer will work harder than you are willing to.
These two items are inevitable given the current rate of change in the industry. While some engineers will find next-level differentiated work to engage in, such as leading a core piece of infrastructure that defines the changing field... many will not. And if the rug gets pulled on that core piece of infrastructure, it's often the case that those engineers are not particularly more skilled than others on brand-new projects.
Well, that's not what the data is showing. Why are you trying to create a narrative that doesn't even try to explain what we observe? Smarter people leave the field earlier, and the author offers a compelling explanation why.
> When teaching undergrads I noticed immediately that a good portion of each student cohort was basically there because they were interested in making money in the future rather than exploring the ideas of computer science.
In my school, those who wanted to make money went straight to management or finance. Computer science was for the passionate ones and probably not the right path to make money for the brightest students.
> the poor students think about languages and libraries as key skills
well so do the recruiters, they’ll be fine
in fact, the better students are the ones wasting their time unless they prefer to be in academia, like you
so what metric are you really gauging for?
the “poor students” are pivoting for money and the name of the university to boost their employment prospects, maybe this shows in their academic performance and ability to understand, I saw the same in undergrad
> they are correct that we are in a dynamic profession that requires constant learning and expanding
Not true. I have met many developers who haven't learned anything new for 15+ years and are still doing just fine developing software. A lot of Java developers come to mind. They have pretty much done the same thing their whole career and have no need or desire to learn anything new.
Once you understand how the industry churns and burns and doesn't really give much credit to capability as you age, it becomes disheartening to even want to be an IC. Most people see the writing on the wall for being an older IC, so they move into management or product or other roles.