Comment by SilverBirch
1 year ago
But this is a misunderstanding of what the AI does. When you say "Generate me diverse senators from the 1800s" it doesn't go to wikipedia, find out the names of US Senators from the 1800s, look up some pictures of those people and generate new images based on those images. So even if it generated 100% white senators it still wouldn't be generating historically accurate images. It simply is not a tool that can do what you're asking for.
I'm not arguing from a technical perspective but from a logical one, as a user of these tools.
If I ask it to generate an image of a "person", it presumably understands what I mean from its training data. The output should fit the description "person", but it should be free to fill in every other detail, _also_ based on its training data: the person's sex, skin color, hair color, eye color, the background, and anything else in the image. That is, when faced with ambiguity, it should make a _plausible_ decision.
But it _definitely_ shouldn't show me a person with purple skin color and no eyes, because that's not based in reality[1], unless I specifically ask it to.
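To make the logic concrete, here is a toy sketch of the behavior I'd expect. The attribute names and frequencies are entirely made up for illustration; a real image model doesn't work with explicit tables like this, but the principle, filling in unspecified details in proportion to how common they are in the training data, is what I mean by a "plausible" decision:

```python
import random

# Hypothetical attribute frequencies, standing in for whatever
# distribution the model actually learned from its training data.
TRAINING_FREQUENCIES = {
    "hair_color": {"black": 0.75, "brown": 0.15, "blond": 0.08, "red": 0.02},
    "eye_color": {"brown": 0.70, "blue": 0.20, "green": 0.10},
}

def fill_unspecified(prompt_attrs):
    """Keep every attribute the prompt pins down; for anything left
    ambiguous, sample a value weighted by its training-data frequency."""
    result = dict(prompt_attrs)
    for attr, freqs in TRAINING_FREQUENCIES.items():
        if attr not in result:
            values, weights = zip(*freqs.items())
            result[attr] = random.choices(values, weights=weights)[0]
    return result

# "purple" skin or "no eyes" never appears in the frequency tables,
# so it can never be sampled unless the prompt asks for it explicitly.
print(fill_unspecified({"eye_color": "green"}))
```

The point of the sketch: attributes the user specifies are honored verbatim, and everything else defaults to something the training data actually contains, with rare things rare and impossible things absent.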
If the technology can't give us these assurances, then it's clearly an issue that should be resolved. I'm not an AI engineer, so it's out of my wheelhouse to say how.
[1]: Or, at the very least, very few people have ever matched that description, so the chance of it producing such output should be correspondingly small.