Comment by kakacik
7 hours ago
They are not taken into account simply because they may be meaningless for the job advertised.
For example, no company I ever worked for (maybe 7 over 20 years) needed published academics. Is the fact that you published a book about some DB architecture 10 years ago going to make you a better team member who delivers things faster? Maybe, maybe not, and how could hiring folks know? You measure and judge what you can and ignore the rest, since it could be a plus or a minus and you have no clue, not even about the probability spread.
Doctors with many licenses may be better, or maybe they just like studying more than actually working with patients, which could make them even worse than average; how do you measure that? A stellar doctor may simply have no time for anything beyond their current work because they also dedicate time to their family and raising children. The quality of a doctor, especially a surgeon, is mostly a function of the work already performed, not some extra papers (I know tens of doctors from all branches closely, and they often claim this themselves). And so on.
Yes, most employers in tech are not looking for published academics, but that's a bit of an oversimplification.
The actual problem is that developers are made into a commodity because they are an unvalued cost center. Developers do not make money; they only take money. So the goal is to avoid training them and to replace them as conveniently as possible.
The result is that employers do not look for talent but instead reduce the requirements to a lowest common denominator. That minimizes personnel-management friction at the cost of everything else. We all know this. The real question for developers is why this is allowed to happen in software but not in other professions.