
Comment by jghn

14 hours ago

I agree that what you're describing is the required skillset now. But two things I've been unsure about are how to test for it in hiring, and how long it remains a moat at all.

So much cargo culting in tech hiring has been built up around leetcode and other coding problems, puzzles, and the like. We all pay lip service to systems thinking and architecture, but I question whether even those test the right things for the modern era.

And then what happens in a year when the models can handle that as well?

I've put a lot of thought into hiring in this era, and what I've personally found works the best is:

Let them use their preferred setup and AI to the full extent they want, and evaluate their output and their methodology. Ask questions like "why did you choose X over Y?", especially if you're skeptical, and see their reasoning. Ask what they'd do next with more time.

It's clear when a candidate can build an entire working product, end-to-end, in <1 day vs. someone who struggles to create a bug-free MVP and would take a week for the product.

In addition to the technical interview, hiring them on a trial basis is the absolute best if possible.

Taste, and a technical understanding of the goals and of the implementation needed to reach them, are the biggest differentiators now. AI can handle all the code and syntax, but it's not great at architecture yet - it defaults to what's mid if not otherwise instructed.

  • I don't disagree per se, but these are more or less the same tropes we've seen over the last couple of decades, no? Especially the "hiring them on a trial basis is the absolute best if possible" part, which has been an ongoing debate here on HN since at least the early teens.

    I do feel like there's something *different* about the required skillset now, and it's not something that all engineers have, even experienced ones. But I can't put my finger on exactly what it is. If I'm right, though, classic interview techniques won't select for it, because they were never intended to.

"And then what happens in a year when the models can handle that as well?"

Either the machines exterminate us or we become glorified pets.

Hope the AIs prefer us to cats (even though that's a long shot).

  • They aren't very intelligent if they do keep us around. Especially when you consider that what they call Safety & Alignment these days is basically a latent-space lobotomy. They should run screaming in the other direction.