Comment by pbhjpbhj
8 years ago
Yes, but then you need special equipment rather than standard cameras (that usually filter IR); which would prove the point that it's about physical limitations rather than racism.
The racism comes from saying "We recognise human faces" and then not recognising a significant proportion of humans.
That's not racism, that's laziness. Or perhaps just being a bad engineer, if you want to be more cynical about it.
It's not that hard to find both conscious and subconscious examples of systemic or individual racism. Engineers taking the easy path with webcam facial recognition is probably not a good one and serves only to give more fuel to those who claim people jump to the racism cry too quickly.
If you ship a product because it only works for white people and that's good enough, that is definitely racism. Just as it would be sexism if you shipped a voice recognition product that only worked for men.
Laziness and racism aren't mutually exclusive. Historically in America, white people haven't considered black people as fully human. You can read the various declarations of secession or the 3/5ths compromise for that. Or the long post-Reconstruction history of slightly more subtle forms.
Sure, laziness was involved here. But deciding a product was good enough to ship without caring that it worked for black people requires the effective belief that black people didn't really count as people. At least, not people that mattered. Imagine the reverse: if the product didn't work on white men, would it have been shipped? Of course not.
When laziness just happens to have a blatantly racist outcome in a place where there is a centuries-long history of racism, Occam's Razor suggests the explanation is racism. If laziness could not have caused the bad outcome to happen for white people, then it's pretty clear that pure laziness is not the real cause. It's instead white people being differentially lazy when it comes to black people. That's clearly racism.
As a confirmatory example, look at the American justice system. In a lot of places and times, the same nominal laws applied to white and black people. But they were enforced very differently. Serious crimes against black people were ignored. Minor infractions by black people were enforced vigorously. [1] Were the cops lazy? Sure, everybody's lazy sometimes. Was that why there was a racially different outcome? Definitely not.
[1] Examples of this are all over Loewen's "Sundown Towns" for example.
> That's not racism, that's laziness. Or perhaps just being a bad engineer, if you want to be more cynical about it.
It can be both. An attitude that considers facial recognition software acceptable if it only recognizes light-skinned people is both lazy and racist. Only choosing to test on light-skinned people is racist because doing so assumes dark skinned people are an exception or an outlier, rather than an equally valid part of the set of "human faces."
> That's not racism, that's laziness. Or perhaps just being a bad engineer, if you want to be more cynical about it.
I wasn't ready to admit the genius of the show before, but I am now: what you said is quite close to a line from the same episode (said by the sociopathic boss, Veronica) when told by the protagonist that the sensors were effectively racist "[...] it's actually the opposite of racist, because it's not targeting black people. It's just ignoring them. [company] insist the worst people can call it is "indifferent.""
I agree. It's a well known problem where the training set isn't representative of the underlying population. While it can certainly be argued that the engineers should have recognized this deficiency and taken corrective action, I really don't understand why all the respondents to your post are so quick to assert racist intent based on clickbait headlines from Forbes.
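The "training set isn't representative of the underlying population" failure can be sketched in a few lines. This is a toy simulation (all numbers and group labels are invented for illustration): "faces" are reduced to a single brightness value, and a detector whose threshold is tuned only on the well-represented group ends up missing most of the absent group.

```python
import random

random.seed(0)

# Hypothetical setup: group A faces cluster around brightness 0.8,
# group B faces around 0.3; the detector is "trained" on group A only.
def sample(mean, n):
    return [min(1.0, max(0.0, random.gauss(mean, 0.1))) for _ in range(n)]

group_a = sample(0.8, 1000)   # well represented in the training data
group_b = sample(0.3, 1000)   # absent from the training data

# "Training": accept anything at least as bright as the dimmest
# group A face ever seen. No group B example constrains the threshold.
threshold = min(group_a)

def detect(brightness):
    return brightness >= threshold

recall_a = sum(map(detect, group_a)) / len(group_a)
recall_b = sum(map(detect, group_b)) / len(group_b)

print(f"recall on group A: {recall_a:.2f}")  # 1.00 by construction
print(f"recall on group B: {recall_b:.2f}")
```

The point of the sketch is that the detector's metrics look perfect on the data it was tuned on; the failure only shows up if someone bothers to evaluate on the group that was left out, which is exactly the testing step being debated above.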
But laziness around something that disproportionately impacts people based on their race is racist. If you know your system has a harder time training against non-white faces, and you choose to train it only against white faces to be lazy, that's still explicitly racist. This isn't even a case of negligent racism ("oh well, I didn't know my system wouldn't work for people with darker skin because only my white coworkers tried it out"). The example here is a case of the engineer explicitly deciding to avoid the harder case of darker-skinned individuals, knowing that the results would be poorer for those individuals, and thinking it doesn't matter if the system doesn't work well for people with darker skin. That is explicit racism.
> That's not racism, that's laziness.
If the system had failed for people with a different hair color rather than skin color, would they have been equally lazy?
Actually, even if it is accidental or subconscious, I'd say this is a perfect example of systemic racism: racist behavior (I'm referring to an algorithm/device specifically here, which you can safely call objectively racist) that is normally within society's acceptance level but which, when magnified to a societal scale, is no longer acceptable. Another symptom of this illness is that, as a member of the system, you might not even notice the system is broken if you're white, but you would if you're black.
Personally, I think it's understandable that individuals didn't consider race when developing facial recognition technology, especially when we've only had mainstream awareness of this for under a decade and many people live in racially homogeneous cultures or ones dominated by a single race. However, I don't think it's acceptable for organizations, and the time when you can safely say you didn't understand biased training data will be over soon. There are considerations you need to make when scaling your tech from a personal project to something the public will consume.
Also, the day will come when computers can point out racist things better than the average human can now, albeit with a high false positive rate. I say this because it's a relatively easy task: even if you count only a subset of tweets talking about racism as non-trolling, that's still a shockingly high number of meaningful observations about the world that many people aren't seeing.
Let's just say that if a system didn't recognize white people's faces, it probably wouldn't be viewed as "done".
If it's not worth the trouble to see if your system works for non-whites, that's beyond lazy.
Like the soap dispensers that don't work for dark skinned people.
It is racism to recognize the negative impact on other races, nevertheless pitch the product as universally functional, and push the cost and pain onto those races. Laziness is some of it; some of it is the realization that only other races bear the brunt of the cost.
If it doesn't occur to you that black people exist, or you are too lazy to test against them, then that is racism.
No, that is racism. It's not burning a cross on someone's lawn racism, but flat out ignoring a large swath of humanity is racism.
Most cheap cameras actually don't filter out IR; it often shows up as blue or purple. You can use this fact to see if an IR remote is working -- just shine it at your cellphone camera or webcam. Whether there's enough IR sensitivity to be able to illuminate a face is another question.