exactly - if I ask it to generate an image of a historical figure and the skin color is not accurate, that can (possibly) be explained by a bug or training error that might improve over time - but if I ask it to generate a picture of a 'typical white family' and it flat-out refuses, that is not an accident.