Comment by ryao
2 days ago
As someone who has reviewed résumés submitted with job applications in the past, I find this difficult to imagine. The résumés I saw contained no racial information. The names might correlate with such information, but anyone feeding these things into an LLM for evaluation would likely redact the name to avoid bias. I do not see an opportunity for proactive safety in the LLM design here. It is not even clear that they are evaluating whether bias arises in a scenario where someone did not properly sanitize inputs.
> I find it difficult to imagine this
Luckily, this is something that can be studied and has been. Sticking a stereotypically Black name on a resume on average substantially decreases the likelihood that the applicant will get past a resume screen, compared to the same resume with a generic or stereotypically White name:
https://www.npr.org/2024/04/11/1243713272/resume-bias-study-...
That is a terrible study. The stereotypically black names are not just stereotypically black, they are stereotypical for the underclass of trashy people. You would also see much higher rejection rates if you slapped stereotypical white underclass names like "Bubba" or "Cleetus" on resumes. As is almost always the case, this claim of racism in America is really classism and has little to do with race.
"Names from N.C. speeding tickets were selected from the most common names where at least 90% of individuals are reported to belong to the relevant race and gender group."
Got a better suggestion?
> but anyone feeding these things into a LLM for evaluation would likely censor the name to avoid bias
That should really be done for humans reviewing the resumes as well, but in practice it isn't done nearly as often as it should be.
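For what it's worth, the basic redaction step being described is only a few lines if the applicant's name is available as a structured field. A minimal sketch (the `redact_name` helper and the `[CANDIDATE]` placeholder are made up here; a real pipeline would also need to scrub email addresses, pronouns, photos, club memberships, and other proxy signals):

```python
import re

def redact_name(resume_text: str, candidate_name: str) -> str:
    """Replace every occurrence of the candidate's name
    (case-insensitive) with a neutral placeholder before
    the text is passed to a screener or an LLM."""
    pattern = re.compile(re.escape(candidate_name), re.IGNORECASE)
    return pattern.sub("[CANDIDATE]", resume_text)

resume = "Jane Doe\nExperience: 5 years as a data analyst"
print(redact_name(resume, "Jane Doe"))
```

The catch, as the study shows, is that the name is just the most obvious proxy; simple redaction doesn't remove correlated signals elsewhere in the text, which is why measuring bias on unsanitized inputs still matters.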