Comment by brainwipe

2 days ago

I was asked by an SME to code on a whiteboard for an interview (in 2005? I think?). I asked if I could have a computer, they said no. I asked if I would be using a whiteboard during my day-to-day. They said no. I asked why they used whiteboards, they said they were mimicking Google's best practice. That discussion went on for a good few minutes and by the end of it I was teetering on leaving because the fit wasn't good.

I agreed to do it as long as they understood that I felt it was a terrible way of assessing someone's ability to code. I was allowed to use any programming language because they knew them all (allegedly).

The solution was a pretty obvious bit-shift. So I wrote memory registers up on the board and did it in Motorola 68000 Assembler (because I had been doing a lot of it around that time). Halfway through they stopped me, and I said I'd be happy to do it again if they gave me a computer.
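
For the curious, a minimal sketch of what "an obvious bit-shift" answer looks like, written in C rather than 68000 assembler. The original puzzle isn't stated here, so the task (multiply an integer by 8 without using multiplication) and the function name are hypothetical.

```c
#include <stdio.h>

/* Hypothetical puzzle: multiply an unsigned integer by 8
   without using the * operator. Shifting left by 3 bits
   multiplies by 2^3 = 8. */
unsigned int times_eight(unsigned int n) {
    return n << 3;
}

int main(void) {
    printf("%u\n", times_eight(5)); /* prints 40 */
    return 0;
}
```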

They offered me the job. I went elsewhere.

You should’ve asked them “do you also mimic Google’s compensation?”

  • I work for a FAANG subsidiary. We pay well below average salary and equity. We finally got one nice perk, a very good 401k match. A few months later it was announced that the 401k match would be scaled back "to come in line with what our parent company offers". I thought about asking "will we be getting salaries or equity in line with what our parent company offers?" but that would have been useless. Management doesn't care. I'm job hunting.

  • Oh man I needed that in the clip for like a dozen interviews a decade ago.

  • This zinger I have to remember for the next time someone tries this whiteboard BS on me!

> I was asked by an SME to code on a whiteboard for an interview (in 2005? I think?). I asked if I could have a computer, they said no. I asked if I would be using a whiteboard during my day-to-day. They said no. I asked why they used whiteboards, they said they were mimicking Google's best practice.

This looks more like a culture fit test than a coding test.

Yeah, very bad fit. Surprised they made an offer.

Folks getting mad about whiteboard interviews is a meme at this point. It misses the point. We CAN'T test you effectively on your programming skill base. So we test on a more relevant job skill, like can you have a real conversation (with a whiteboard to help) about how to solve the problem.

It isn't that your interviewer knew all the languages, but that the language didn't matter.

I didn't get this until I was giving interviews. The instructions on how to give them are pretty clear. The goal isn't to "solve the puzzle" but instead to demonstrate you can reason about it effectively, communicate your knowledge and communicate as part of problem solving.

I know many interviewers also didn't get it, and it became just "do you know the trick to my puzzle". That pattern of failure is a good reason to deprecate whiteboard interviews, not "I don't write on a whiteboard when I program in real life".

  • > We CAN'T test you effectively on your programming skill base. So we test on a more relevant job skill, like can you have a real conversation (with a whiteboard to help) about how to solve the problem.

    Except, that's not what happens. In basically every coding interview in my life, it's been a gauntlet: code this leetcode medium/hard problem while singing and tapdancing backwards. Screw up in any way -- or worse (and also commonly) miss the obscure trick that brings the solution to the next level of algorithmic complexity -- and your interview day is over. And it's only gotten worse over time, in that nowadays, interviewers start with the leetcode medium as the "warmup exercise". That's nuts.

    It's not a one-off. The people doing these interviews either don't know what they're supposed to be looking for, or they're at a big tech company and their mandate is to be a severe winnowing function.

    > It isn't that your interviewer knew all the languages, but that the language didn't matter.

    I've done enough programming interviews to know that using even a marginally exotic language (like, say, Ruby) will drastically reduce your success rate. You either use a language that your interviewer knows well, or you're adding a level of friction that will hurt you. Interviewers love to say that language doesn't matter, but in practice, if they can't tell whether you're making up the syntax, their skepticism dials up.

    • They generally do not know what they are looking for. They are generally untrained, and if they are trained, the training is probably all about using leetcode-type problems to give interviews that are sufficiently similar that you can run stats on the results and call them "objective", which is exactly the thing we are all quite correctly complaining about. Which is perhaps anti-training.

      The problem is that the business side wants to reduce it to an objective checklist, but you can't do that because of Goodhart's Law [1]. AI is throwing this problem into focus because it is basically capable of passing any objective checklist, with just a bit of human driving [2]. Interviews cannot consist of "I'm going to ask a question and if you give me the objectively correct answer you get a point and if you do not give the objectively correct answer you do not". The risk of hiring someone who could give the objectively correct answers but couldn't program their way out of a wet paper bag, let alone do requirements elicitation in collaboration with other humans or architecture or risk analysis or any of the many other things that a real engineering job consists of, was already pretty high before AI.

      But if interviewing is not a matter of saying the objectively correct things, a lot of people at all levels are just incapable of handling it after that. The Western philosophical mindset doesn't handle this sort of thing very well.

      [1]: https://en.wikipedia.org/wiki/Goodhart%27s_law

      [2]: Note this is not necessarily bad because "AI bad!", but, if all the human on the other end can offer me is that they can drive the AI, I don't need them. I can do it myself and/or hire any number of other such people. You need to bring something to the job other than the ability to drive an AI, and you need to demonstrate whatever that is in the interview process. "I can type what you tell me into a computer and then fail to comprehend the answer it gives" is not a value-add.

    • When I joined my current team I found they had changed the technical test after I had interviewed but before I joined. A couple of friends also applied and got rejected because of this new test.

      When I finally got in the door and joined the hiring effort I was appalled to find they’d implemented a leetcode-esque series of challenges with criteria such as “if the candidate doesn’t immediately identify and then use a stack then fail interview”. There were 7 more like this with increasingly harsh criteria.

      I would not have passed.

  • > can you have a real conversation (with a whiteboard to help) about how to solve the problem

    And do you frame the problem like that when giving interviews? Or are the candidates led to believe working code is expected?

    • Do I? Yes. I also teach my students that the goal of an interview is to convince the interviewer you are a good candidate, not to answer the questions correctly. Sometimes they correlate. Give the customer what they need, not what they asked for.

      Do I see others doing so? Sadly, no.

      I feel like a lot of the replies to my comment didn't read to the end. I agree the implementation is bad. The whiteboard just isn't actually the problem. The interviewers are.

      Unless they change mentality to "did this candidate show me the skills I am looking for" instead of "did they solve the puzzle", the method doesn't matter.

  • > The goal isn't to "solve the puzzle" but instead to demonstrate you can reason about it effectively, communicate your knowledge and communicate as part of problem solving.

    ...while being closely monitored in a high-stakes performance in front of an audience of strangers judging them critically.

  • > So we test on a more relevant job skill, like can you have a real conversation (with a whiteboard to help) about how to solve the problem.

    Everybody says that, but in reality they don't, imho. If you don't pass the pet-question quiz, you "don't know how to program" or are a "faker", etc.

    I've seen this over and over and if you want to test a real conversation you can ask about their experience. (I realize the challenge with that is young interviewers aren't able to do that very well with more experienced people.)

  • +1 to all this. It still surprises me how many people, even after being in the industry for years, think the goal of any interview is to “write the best code” or “get the right answer”.

    What I want to know from an interview is if you can be presented an abstract problem and collaboratively work with others on it. After that, getting the “right” answer to my contrived interview question is barely even icing on the cake.

    If you complain about having to have a discussion about how to solve the problem, I no longer care about actually solving the problem, because you’ve already failed the test.

    • I think you're severely underestimating how much just about every software company has bought into the FAANG philosophy, and how many candidates they get who can answer those questions correctly.

      Yes, if you don't communicate clearly, you will get points deducted. But if you can't answer the question nearly perfectly, it's basically an immediate fail.

    • Unfortunately I used to think this was the main purpose of the interview as well, but have been proven wrong time and time again.

      The only thing that matters in most places is getting to the optimal solution quickly. It doesn't matter if you explain your thought process or ask clarifying questions, just get to the solution and answer the time and space complexity correctly and you pass.

      Like others have said, I think this is a symptom of the sheer number of people applying and needing to go through the process; there is no time for nuance, or for evaluating whether you would actually like to work with someone.

Are there people who still aren't aware that FAANGs developed this kind of thing to bypass H-1B regulations?

> They offered me the job. I went elsewhere.

I am so happy that you did this. We vote with our feet and sadly, too many tech folks are unwilling to use their power or have golden handcuff tunnel vision.

> I was allowed to use any programming language because they knew them all (allegedly).

After 30 years of doing this, I find that the people who claim to know a lot typically know very little. They're so insecure in their ability that they've tricked themselves into not learning anything.

> I was allowed to use any programming language because they knew them all (allegedly).

brainfuck time

2005? You were in the right.

Today? Now that's when it is tricky. How can we know you are not one of these prompt "engineer" copy-pasters? That's the issue being discussed.

20 years and many new technologies of difference.

  • What is the functional difference between copying an AI answer and copying a StackOverflow answer, in terms of it being "cheating" during an interview?

    I think the entire question is missing the forest for the trees. I have never asked a candidate to write code in any fashion during an interview. I talk to them. I ask them how they would solve problems, chase down bugs, or implement new features. I ask about concepts like OOP. I ask about what they've worked on previously, what they found interesting, what they found frustrating, etc.

    Languages are largely teachable, it's just syntax and keywords. What I can't teach people is how to think like programmers need to: how to break down big, hard problems into smaller problems and implement solutions. If you know that, I can teach you fucking Swift; it isn't THAT complicated, and there's about 5 million examples of "how do I $X" available all over the Internet.

    • > Languages are largely teachable, it's just syntax and keywords.

      This is like "learning a natural language is just 'cramming vocabulary and grammar' - voila, you've become a fluent C1 speaker". :-)

      Seriously: if you argue this way, you have only seen a very biased set of programming languages, and additionally, your knowledge of these programming languages is very superficial (i.e. you have never gotten to the "interesting"/"deep" concepts that make this particular programming language special, and which are hard to replicate in most other programming languages).

    • > Languages are largely teachable, it's just syntax and keywords.

      That's only true for a subset of programming languages, and it requires you to already know how to program in at least another language of the same family. Knowing Java will not help you with Haskell, but it will help you with C#.

      I have to deal with students using AI to cheat on homework and exams, and I can't allow them to skip learning even the basic concepts.

      They could convince you with buzzwords, get hired, and then feed all problems to the AI until the codebase gets too big for the context window, at which point all their prompt “engineering” experience is basically useless.

      That is the future I am trying to prevent.

      Until the AI can code a full system like SAP, or an Operating System, or a Photoshop clone, by itself, we need some people in the loop, and the more knowledgeable the people, the better.
