Comment by local_yokel
8 years ago
I agree. It's a well-known problem where the training set isn't representative of the underlying population. While it can certainly be argued that the engineers should have recognized this deficiency and taken corrective action, I really don't understand why all the respondents to your post are so quick to assert racist intent based on clickbait headlines from Forbes.
It's not active racism on the part of the engineers. It's way more subtle than that.
There could be a hidden assumption on the part of the engineering team that light skin is "normal" and anything else is a special case. Nobody is saying "I hate black people" or anything of the kind.
Or, as happened with a voice recognition system a former employer used, it was tested on the engineering staff, who happened to be all male. As a result it didn't work well for most women who tried to use it. There was no intentional exclusion of women from the test data, and I'd argue, no intentional exclusion of women from the engineering teams. But it is reasonable to say that systemic sexism that excludes women from engineering careers helped this system fail.
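The voice-recognition failure described above can be sketched with a toy model. Everything here is an illustrative assumption, not the actual system: the pitch distributions, the sample sizes, and the crude "accept anything within 2 standard deviations of the training data" rule. The point is only to show how a model calibrated on one group can silently fail on another with no exclusion intended at any step:

```python
import random
import statistics

random.seed(0)

def sample_pitches(mean, sd, n):
    # Synthetic fundamental-frequency (pitch) samples in Hz.
    return [random.gauss(mean, sd) for _ in range(n)]

# Training data collected only from the engineering staff,
# who all happen to come from one group.
train = sample_pitches(mean=120, sd=20, n=500)

mu = statistics.mean(train)
sigma = statistics.stdev(train)

def accepts(pitch):
    # Toy "voice detector": accept anything within 2 standard
    # deviations of what the model saw during training.
    return abs(pitch - mu) <= 2 * sigma

def acceptance_rate(pitches):
    return sum(accepts(p) for p in pitches) / len(pitches)

group_a = sample_pitches(mean=120, sd=20, n=1000)  # resembles training set
group_b = sample_pitches(mean=210, sd=25, n=1000)  # never seen in training

print(f"group A acceptance: {acceptance_rate(group_a):.2f}")
print(f"group B acceptance: {acceptance_rate(group_b):.2f}")
```

Run it and group A passes the detector almost every time while group B is rejected almost every time, even though nobody wrote a line of code that mentions either group. The bias lives entirely in who supplied the training data.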
These kinds of problems are difficult to solve because they aren't active decisions on anyone's part. They evolve out of pervasive conditions and unconscious biases. At root, the source is still racism, or sexism (or another form of discrimination).
Because it is racism. It is ignoring a large swath of humanity based on the color of their skin. You may not want to think of it as racism because it's not the burn a cross on their lawn type of racism, but it's still systemic racism.
> It is ignoring a large swath of humanity based on the color of their skin.
No, it's an insufficiently sensitive contrast filter combined with too narrow a training set. Screaming racism at everything that even marginally approaches the topic of race detracts from real racism and ignores the actual issue here - shitty software.