Comment by d1str0

3 months ago

That’s literally what this whole article was about: removing a high-correlation performance test that black candidates didn’t pass as frequently, and replacing it with a very low-correlation questionnaire that produced a more diverse applicant pool while weeding out highly qualified individuals.

Exactly. From the article: "As originally scored, the test was intended to pass 60% of applicants, but predictions suggested only 3% of black applicants would pass"

They still had to pass the performance test. It was just no longer the first step in the process. To be clear, that doesn't mean the questionnaire was a good thing. It just means that the questionnaire did not lower the bar.

Instead, it reduced the applicant pool in a sudden and unfair manner, which is its own issue.

  • No, read the article again. They didn't need to pass the same test to the same degree; the criteria were also changed to include "qualified" and "well qualified".

    • It's worth noting that this change happened before the questionnaire was instituted. (The paper referenced in the article is from 2006. I haven't dug up a date for when this change was made, but the article's narrative also places it in the '00s.) Additionally, from the Conclusions:

      "Reweighting was based on data collected from incumbent ATCSs who took AT-SAT on a research basis; some of these employees achieved overall scores less than 70 (that was one of the reasons for the reweighting effort – a belief that incumbent employees should be able to pass the entry-level selection test)."

      I don't think this proves the update to the test was good or bad for overall competency, but I do think it's worth investigating whether the test should be updated when existing employees are unable to pass it.