Comment by HarHarVeryFunny
1 year ago
These systems should (within reason) give people what they ask for, and use some intelligence (not woke-ism) in responding, the same way a human assistant would if asked to find a photo.
If someone explicitly asks for a photo of a person of a specific ethnicity, skin color, sex, etc., it should give them that, no questions asked. There is nothing wrong with wanting a picture of a white guy, or a black guy, etc.
If the request includes a cultural/career/historical/etc. context, then the system should use that to guide the ethnicity/sex/age/etc. of the person, the same way a human would. If I ask for a picture of a waiter/waitress in a Chinese restaurant, then I'd expect him/her to be Chinese (as is typical) unless I'd asked for something different. If I ask for a photo of an NBA player, then I expect him to be black. If I ask for a picture of a nurse, then I'd expect a female nurse, since women dominate that field, although I'd be OK getting a man 10% of the time.
Software engineer is perhaps a bit harder, but it's certainly a male-dominated field. I think most people would want to get someone representative of that role in their own country. Whether that implies white by default (or by statistical prevalence) in the USA, I'm not sure. If the request came from someone located in a different country, then it'd seem preferable & useful for them to get someone of their own nationality.
I guess where this becomes most contentious is where there is, like it or not, a strong ethnic/sex/age association with a particular role for cultural/historical reasons, but it's considered insensitive to point this out. Should the default settings of these image generators be to reflect statistical reality, or to reflect some statistics-be-damned fantasy defined by their creators?