Comment by Tarragon
1 year ago
"It’s often assumed that African people arrived in Scotland in the 18th century, or even later. But in fact Africans were resident in Scotland much earlier, and in the early 16th century they were high-status members of the royal retinue."
https://www.nts.org.uk/stories/africans-at-the-court-of-jame...
An article about a small number of royally-associated Africans in Scotland in the 16th century does not justify an image-generating AI producing large numbers of black people in pictures of Scottish people in the 16th century.
The Scotland link in the grandparent post is to a picture of two people, one white and one black. One is not "large numbers".
Look, Gemini is clearly doing some weird stuff. But going all "look what crazy thing it did" over this specific image is bullshit. Maybe it's a misunderstanding of Scotland specifically and the prevalence of black people in history generally, in which case it needs to be gently corrected.
Or it's performative histrionics.
The argument I think you're making is "0.0001% of Scottish people in the 16th century were black, so it's not reasonable to criticize Google if it produces historical images of Scottish people where >25% of the individuals are black".
If you take the totality of examples given (beyond the Scottish one), it's clear there's nothing specific about Scotland here; the problem is systemic, and centered on class and race specifically. It feels to me, consistent with what many others have expressed, that Google is deliberately applying query rewrites or other mechanisms to generate diversity where it historically did not exist. That's why they shut down image generation a day after launching it.