Comment by sarchertech
6 months ago
>There it goes, your other metric.
1. It’s impossible to measure that except in aggregate
2. The only thing you can actually measure is how soon it launched, because the quality of the thing is too wrapped up in what it's supposed to do.
How many bugs were there? That's heavily dependent on how complicated the problem it's solving is.
The absolute best thing anyone has come up with is velocity, which just measures how good a team is at estimating. It also only works at the team level, and it's only accurate for teams repeatedly doing similar work whose composition hasn't changed in years.
> You can't accurately measure any coastline, but we don't stop at that.
We actually can accurately measure the relative lengths of coastlines, such that we can compare them, by picking a measurement resolution and using it for all of them.
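To make the coastline point concrete, here is a minimal sketch of fixed-resolution measurement, assuming each coastline is given as a polyline of (x, y) points; `resample` and `measured_length` are hypothetical helper names, not any standard API:

```python
import math

def resample(points, step):
    """Return points spaced `step` apart along the polyline's arc length."""
    out = [points[0]]
    carry = 0.0  # arc length accumulated since the last emitted point
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        while carry + seg >= step:
            # advance to the point `step - carry` into this segment
            t = (step - carry) / seg
            x0, y0 = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
            seg = math.hypot(x1 - x0, y1 - y0)
            carry = 0.0
            out.append((x0, y0))
        carry += seg
    return out

def measured_length(points, resolution):
    """Sum straight-line chords between resampled points: the 'ruler' method.
    A coarser resolution cuts across the wiggles and measures a shorter length."""
    pts = resample(points, resolution)
    return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))
```

Any single coastline's measured length grows as the ruler shrinks (the coastline paradox), but as long as you measure every coastline with the *same* resolution, the comparison between them is meaningful.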
You can’t do that here. It doesn’t work.
If we could accurately map business value creation back to ICs, tech companies would look very, very different.
Even if you could objectively measure how good an IC is at meeting objectives, the developers I've met who were best at that metric tended to be terrible at their jobs in general, because they'd just take anything product wanted and build it exactly as described.
So you want some amount of pushback, but not too much, or they just end up not doing anything.
Try to come up with a metric for that. To work, though, it would need to be normalized by how good the PM is at designing features: if the PM were terrific, you wouldn't want any pushback at all.
So now you have to rate all your PMs before you can rate your ICs. Then you need to rate the PMs' bosses, and so on, until you have an uncomputable mess.
The best you can do is have enough EMs that they can get a feel for how ICs are doing. You can also probably expose the absolute worst-performing engineers with metrics, because they're literally doing nothing. But even then, you'll get some false positives.