Comment by placardloop

2 days ago

“Real problems” aren’t something that can be effectively discussed in the time span of an interview, so companies concoct unreal problems that are meant to be good indicators.

Beyond that, these unreal questions/problems are decent proxies for general knowledge in humans, but not in AI. Humans don't have encyclopedic knowledge, so questions on a topic can do a decent job of indicating that a person has broader depth of knowledge in that topic and could bring it to bear in a job. An AI can answer all the questions but can't bring that to bear in a job.

We saw this last year with all the "AI can now pass the bar exam" articles, but that doesn't lead to them being able to do anything approaching practicing law, because AI failure modes are not the same as human ones and can't be tested the same way.

Really? How short are your interviews, and how big are these Real Problems such that you can't get a sense of how your candidate would start to tackle them?

  • The “real problems” most companies want people to help solve involve the evolution of products over years: repeated design discussions, in-depth research, and applying retrospective learning. I don’t need someone who can just glue a Rails API together. If I did, I could literally just download that from the internet for free.

    If my problems could be solved in the time span of an interview, why would I waste my time doing that interview instead of just solving it?

    • I don't see the issue here. Nobody expects candidates to build actual product during the interview. Having a (targeted, scope- and time-limited) design discussion, or giving your candidate some made-up context around an engineering cycle and then doing a retrospective with them, are practical and useful ways to interview a candidate.

      I'm also not sure what the alternative is? Just not hiring?
