
Comment by ifyoubuildit

1 year ago

How much of this do they do to their search results?

Google "white family" and count how many non-white families show up in the image results. 8 out of the first 32 images didn't match, for me.

Now, sometimes showing you things slightly outside of your intended search window can be helpful; maybe you didn't really know what you were searching for, right? Who's to say a nudge in a certain direction is a bad thing?

Extrapolate to every sensitive topic.

EDIT: for completeness, google "black family" and count the results. I guess for this term, Google believes a nudge is unnecessary.

  • google image search "Chief Diversity Officer" and you'll see an extremely un-diverse group of people.

  • It's true, if you look at Bing and Yahoo you can see the exact same behavior!

    • > This is conspiratorial thinking at its finest.

      Sounds crazy, right? I half don't believe it myself, except we're discussing this exact built-in bias with their image generation algorithm.

      > No. If you look at any black families in the search results, you'll see that it's keying off the term "white".

      Obviously they are keying off alternate meanings of "white" when you use white as a race. The point is, you cannot use white as a race in searches.

      Google any other "<race> family" and you get exactly what you expect: black family, Asian family, Indian family, Native American family. Why is white not a valid race query? Actually, just typing that out makes me cringe a bit, because searching for anything "white" is obviously considered racist today. But here we are, white things are racist, and hence the issues with Gemini.

      You could argue that white is an ambiguous term, while Asian or Indian are less so, but Google knows what they're doing. Search for "white skinned family" or similar and you actually get even fewer white families.

> How much of this do they do to their search results?

This is what I'm wondering too.

I am aware that there have been kerfuffles in the past about Google Image Search queries for `white people` pulling up non-white pictures, but I thought that was because so much of the source material doesn't label white people as `white`, since it's assumed to be the default. I assumed the same thing was happening again when I first heard of the strange Gemini results, until I saw the evidence of explicit prompt injection and clearly ahistorical/nonsensical results.