Comment by hrimfaxi
5 hours ago
How exactly are good doctors easy to identify and hard to fire? And how does it follow that AI is a net negative when wielded by professionals who are excellent at what they do?
If people can't identify qualified professionals without relying on credentials, they probably aren't qualified to be hiring managers.
There are a couple of things that identify talent in the qualified space:
* Peer reviews in the industry
* Publications in peer reviewed journals
* Owner/partner of a firm of licensed professionals
* Quantity of surgeries, clients, products, and so forth
* Transparency around lawsuits, license violations, ethics violations, and so forth
* Multiple licenses. Not one, but multiple stacked on top of a base qualification license. For example an environmental lawyer will clearly have a law license, but will also have various environmental or chemistry certifications as well. Another example is that a cardiologist is not the same as a nurse practitioner or general physician.
Compare all of that against what the typical developer has:
* I have been employed for a long time
More elite developers might have these:
* Author of a published book
* Open source software author of application downloaded more than a million times
Those elite credentials aren't taken very seriously in hiring, despite the effort and weight behind them compared to what their peers offer.
They are not taken into account simply because they may be meaningless for the job advertised.
For example, no company I ever worked for (maybe 7 over 20 years) needed publishing academics. Is the fact that you published a book about some DB architecture 10 years ago going to make you a better team member who delivers things faster? Maybe, maybe not; how could hiring folks know? You measure and judge what you can and ignore the rest, since it could cut either way and you have no clue, not even about the probability spread.
Doctors with many licenses may be better, or maybe they just like studying more than actually working with patients, and thus may even be worse than average; how do you want to measure that? A stellar doctor may simply have no time for anything beyond their current work because they also dedicate time to their family and raising children. The quality of a doctor, especially a surgeon, is mostly a function of work already performed (knowing tens of doctors closely, from all branches, they themselves often say this), not some extra papers. And so on and on.
Yes, most employers in tech are not looking for published academics. That’s a bit of an oversimplification.
The actual problem is that developers are made into a commodity because they are an unvalued cost center. Developers do not make money. They only take money. So, the goal is to not train them and replace them as conveniently as possible.
The result is to not look for talent but instead equate the requirements to a lowest common denominator. That minimizes personnel management friction at cost to everything else. We all know this. The real concern for developers is why that is allowed to occur in software but not other professions.
> And how does it follow that AI is a net negative when wielded by professionals who are excellent at what they do?
Simple: the 80% of code monkeys who are not good at what they do will cause far more damage than the "professionals who are excellent at what they do". And outside of tech, I can guarantee you the vast majority of people use LLMs to do less, not to do more or do better.
It's also easily verifiable. Supposedly AI makes everyone a 10x developer/worker, and it's been ~3 years now. Where are the benefits? Which company or industry made 10x progress, earned 10x revenue, or cut 90% of its workforce?
How many man-hours are lost on AI slop PRs? On AI-written tickets that seem to make sense at first but fall apart once you dig deep? On AI reports from McKinsey & Co that cite fake sources?
The big wins I've seen with LLM tools are in translation more than in the actual code.
We have been able to move our "low cost" work out of India to Eastern Europe, Vietnam, and the Philippines; we pay more per worker, but we need half as many (and can actually train them).
Granted, our business process was already tolerant of low-cost regions producing a large amount of crap: separate teams do testing and documentation...
It's been more of a mixed bag in the "high skill" regions: we have been getting more pushback against training, with people wanting to be on senior+ teams only, because of the LLM code produced by juniors. This is completely new, as it's coming from people who used to see mentoring and teaching as a solid positive in their job.
It's really only in the last year that LLMs have gotten great and can output massive blocks of code that are functional.
LLMs are at least a 10x speedup on the 'producing code' portion of the job, but that's often only a fraction of the overall job. There's lots of time spent planning, sitting in meetings, doing research, dealing with corporate red tape, etc.
For writing code and unit tests, and generating deployment YAML and Terraform, for me it's easily a 30x speedup. I can do in 1 or 2 hours what would previously have taken a week.
So where are the AI-induced gains?
If it is AGI and you have access to it, pivot from code now. You will make millions or billions on pharmaceutical products in as little as one month. Start making cures for non-small-cell cancers; those are untapped markets. Reverse aging. Even at $1 million a treatment, over ten treatments, you would be sought after. Go! Get off HN! Rule the world.