
Comment by DanBC

8 years ago

The racism comes from saying "We recognise human faces" and then not recognising a significant proportion of humans.

That's not racism, that's laziness. Or perhaps just being a bad engineer, if you want to be more cynical about it.

It's not that hard to find both conscious and subconscious examples of systemic or individual racism. Engineers taking the easy path with webcam facial recognition is probably not a good one, and it serves only to give more fuel to those who claim people cry racism too quickly.

  • If you ship a product because it only works for white people and that's good enough, that is definitely racism. Just as it would be sexism if you shipped a voice recognition product that only worked for men.

    Laziness and racism aren't mutually exclusive. Historically in America, white people haven't considered black people fully human; for that, read the various declarations of secession or the 3/5ths compromise, or the long post-Reconstruction history of subtler forms of the same attitude.

    Sure, laziness was involved here. But deciding a product was good enough to ship without caring that it worked for black people requires the effective belief that black people didn't really count as people. At least, not people that mattered. Imagine the reverse: if the product didn't work on white men, would it have been shipped? Of course not.

    When laziness just happens to have a blatantly racist outcome in a place where there is a centuries-long history of racism, Occam's Razor suggests the explanation is racism. If laziness could not have caused the bad outcome to happen for white people, then it's pretty clear that pure laziness is not the real cause. It's instead white people being differentially lazy when it comes to black people. That's clearly racism.

    As a confirmatory example, look at the American justice system. In a lot of places and times, the same nominal laws applied to white and black people. But they were enforced very differently. Serious crimes against black people were ignored. Minor infractions by black people were enforced vigorously. [1] Were the cops lazy? Sure, everybody's lazy sometimes. Was that why there was a racially different outcome? Definitely not.

    [1] Examples of this are all over Loewen's "Sundown Towns" for example.

    • I think it's important to start talking about, and acknowledging, that nearly every second of the average person's life is thinking about something that is within their own universe of their life's experiences.

      So plausibly, this really was an innocent and understandable mistake. Maybe they grew up in a town full of white people, and maybe all of their friends and coworkers are white. I understand what they did is technically racist, but only because of that particular combination of people. Let's use _this_ 'experience' of "racist", but not its exact definition. For example:

      - It was racist you forgot your mother-in-law's birthday (whom you see once a year.)

      - It was racist you didn't lift the toilet seat. (Edit: I grew up in a culture where it was offensive to not raise the seat.)

      Definition: Any unintentional side effect from not thinking about someone* outside of everything you've experienced, and presumably causing harm to that outsider.

      * Usually this person is disadvantaged in some way. But almost everyone is disadvantaged in some way, thus neutralizing this particular point IMO.

      Does that sound rational? I don't think it does.

      The world is full of suffering, and I'm advocating that people help and love the people they are surrounded by. If many people did that, it would be easier to recognize each other, wherever you come from. Why? Because hate will push away everything it is unfamiliar with, but love will accept everything it is unfamiliar with.

      (Now meta argument, do I expect people to change? No. So I try accepting them instead.)


    • This is neat: in a conversation about the technical aspects of image recognition with webcams in low-lighting conditions, you're talking about secession and the 3/5th compromise. Let's try to stay on topic.

      > But deciding a product was good enough to ship without caring that it worked for black people requires the effective belief that black people didn't really count as people.

      Nobody ever said "oh it doesn't work with black people? Who cares LOL." I was going to make a crack about mental gymnastics but this isn't even that. You're just making shit up at this point.

      > When laziness just happens to have a blatantly racist outcome in a place where there is a centuries-long history of racism, Occam's Razor suggests the explanation is racism.

      No, even using your own sentence you admit the cause is laziness, but then you reframe it as racism because that fits your worldview. Just like the fact that we're talking about webcams and you're talking about the 3/5th compromise and the American justice system.

      Is there institutional/structural racism in the US? Absolutely. Webcams aren't a good example of it, sorry.


    • Seeing someone interact with others after drinking enough kool-aid to fill a pool is actually pretty entertaining. Thanks, I needed a break from work.

      Speaking of work, have you tried testing your definition of racism on non-white subjects and products, such as Black Entertainment Television? It doesn't hold up very well.


  • > That's not racism, that's laziness. Or perhaps just being a bad engineer, if you want to be more cynical about it.

    It can be both. An attitude that considers facial recognition software acceptable if it only recognizes light-skinned people is both lazy and racist. Choosing to test only on light-skinned people is racist because doing so assumes dark-skinned people are an exception or an outlier, rather than an equally valid part of the set of "human faces."

  • > That's not racism, that's laziness. Or perhaps just being a bad engineer, if you want to be more cynical about it.

    I wasn't ready to admit the genius of the show before, but I am now: what you said is quite close to a line from the same episode, said by the sociopathic boss, Veronica, when told by the protagonist that the sensors were effectively racist: "[...] it's actually the opposite of racist, because it's not targeting black people. It's just ignoring them. [company] insist the worst people can call it is 'indifferent.'"

  • I agree. It's a well known problem where the training set isn't representative of the underlying population. While it can certainly be argued that the engineers should have recognized this deficiency and taken corrective action, I really don't understand why all the respondents to your post are so quick to assert racist intent based on clickbait headlines from Forbes.

    • It's not an active racism on the part of the engineers. It's way more subtle than that.

      There could be a hidden assumption on the part of the engineering team that light skin is "normal" and anything else is a special case. Nobody is saying "I hate black people" or anything of the kind.

      Or, as happened with a voice recognition system a former employer used, it was tested on the engineering staff, who happened to be all male. As a result it didn't work well for most women who tried to use it. There was no intentional exclusion of women from the test data, and I'd argue, no intentional exclusion of women from the engineering teams. But it is reasonable to say that systemic sexism that excludes women from engineering careers helped this system fail.

      These kinds of problems are difficult to solve, because they aren't active decisions on anyone's part. They evolve out of pervasive conditions, and unconscious biases. At root, the source is still racism, or sexism (or another form of discrimination).

    • Because it is racism. It is ignoring a large swath of humanity based on the color of their skin. You may not want to think of it as racism because it's not the burn a cross on their lawn type of racism, but it's still systemic racism.


  • But laziness around something that disproportionately impacts people based on their race is racist. If you know your system has a harder time training against non-white faces, and you choose to train it only against white faces to be lazy, that's still explicitly racist. This isn't even a case of negligent racism ("oh well I didn't know my system wouldn't work for people with darker skin because only my white coworkers tried it out"). The example here is a case of the engineer explicitly deciding to avoid the harder case of darker-skinned individuals, knowing that the results would be poorer for those individuals, and thinking it doesn't matter if the system doesn't work well for people with darker skin. That is explicit racism.

    • Skin color is not synonymous with race. Every race has people that span a wide range of skin tones, and skin tone fades with age, so you may as well be arguing it's ageism.

  • That's not racism, that's laziness.

    If the system had failed for people with a different hair color rather than skin color, would they have been equally lazy?

  • Actually, even if it is accidental or subconscious, I'd say this is a perfect example of systemic racism: racist behavior (I'm referring to an algorithm/device specifically here, which you can safely call objectively racist) that normally falls within society's acceptance level but, when magnified to a societal scale, is no longer acceptable. Another symptom of this illness is that, as a member of the system, you might not even notice it is broken if you're white, but you would if you're black.

    Personally, I think it's understandable that individuals didn't consider race when developing facial recognition technology, especially when mainstream awareness of the problem is under a decade old and many people live in racially homogeneous or single-race-dominated cultures. However, I don't think it's acceptable for organizations, and the time when you can safely say you didn't understand biased training data will soon be over. There are considerations you need to make when scaling your tech from a personal project to something the public will consume.

    Also, the day will come when computers can point out racist things better than the average human can now, albeit with a high false-positive rate. I say this because it's relatively easy: even if you count only a subset of tweets about racism as non-trolling, that's still a shockingly high number of meaningful observations about the world that many people aren't seeing.

  • Let's just say that if a system didn't recognize white people's faces, it probably wouldn't be viewed as "done".

    If it's not worth the trouble to see if your system works for non-whites, that's beyond lazy.

    Like the soap dispensers that don't work for dark-skinned people.

  • It is racism to realize a product's negative impact on other races, pitch it as universally functional anyway, and push the cost and pain onto those races. Laziness is some of it; some of it is the realization that only other races bear the brunt of the cost.

  • If it doesn't occur to you that there are black people, or you are too lazy to test against them, then that is racism.

  • No, that is racism. It's not burning a cross on someone's lawn racism, but flat out ignoring a large swath of humanity is racism.
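
Several comments above turn on the same mechanism: a training set that under-represents part of the population the product will face. That check can be automated before shipping; below is a minimal Python sketch (the group labels, population shares, and the 0.5 tolerance are illustrative assumptions, not taken from any real product):

```python
from collections import Counter

def representation_gaps(train_labels, population_shares, tolerance=0.5):
    """Flag groups whose share of the training data falls below
    `tolerance` times their share of the target population.

    train_labels: list of group labels, one per training example.
    population_shares: dict mapping group -> population fraction.
    Returns {group: (train_share, population_share)} for flagged groups.
    """
    counts = Counter(train_labels)
    total = len(train_labels)
    gaps = {}
    for group, pop_share in population_shares.items():
        train_share = counts.get(group, 0) / total
        if train_share < tolerance * pop_share:
            gaps[group] = (train_share, pop_share)
    return gaps

# Hypothetical skew: 95 of 100 training faces come from one group,
# while the target population is assumed to be a 60/40 split.
train = ["light"] * 95 + ["dark"] * 5
gaps = representation_gaps(train, {"light": 0.6, "dark": 0.4})
print(gaps)  # {'dark': (0.05, 0.4)}
```

Running a check like this on each demographic axis before release turns the "did we test on everyone?" question from a cultural accident into a routine release gate.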