Comment by krapp
8 years ago
True, but that doesn't really contradict the premise of my comment.
The assumption that one's own race is the default color for humans (rather than an arbitrary value) is what leads to facial recognition software being trained on data sets so narrow that the resulting software can't even recognize skin colors diverging from that norm. That failure is as much about implicit, albeit unintentional, racial bias as it is about the particular technical problems involved and the pressure on engineers to cut costs and meet deadlines.
It's not entirely about racism, but racism is a component.