
Comment by alexpotato

1 day ago

Recent fascinating experience with hiring and AI.

- DevOps role

- Technical test involves logging into a server (via sadservers.com who are awesome)

- We tell the candidates: "The goal is to see if you can work through a problem on a Linux based system. It's expected you'll see some things you may never have seen before. Using Google and ChatGPT etc is fine if you get stuck. We just ask that you share your screen so we can see your search and thought processes."

- Candidate proceeds to just use ChatGPT for EVERY SINGLE STEP: "How do I list running processes?", "How do I see hidden files?", and copy-pasted every error message into ChatGPT, etc.
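For context on why this was surprising: the questions the candidate was asking are textbook one-liners that a working DevOps engineer would typically know cold (a sketch, assuming a standard Linux box like the sadservers.com scenarios):

```shell
# List running processes (the first question asked of ChatGPT)
ps aux

# Show hidden files (dotfiles) in the current directory (the second question)
ls -a
```

These are roughly the Linux equivalent of asking how to open a file in an editor, which is what made the zero-signal problem so stark.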

Now, I had several thoughts about this:

1. Is this common?

2. As one coworker joked "I knew the robots were coming, just not today"

3. We got essentially zero signal on his Linux related debugging skills

4. What signal was he trying to send by doing this? I would have assumed he'd realize "oh, they're hiring people who are well versed in Linux and troubleshooting"

5. I know some people might say "well, he probably eventually got to the answer", but the point is that ChatGPT doesn't always have the answer.