
Comment by dougmwne

1 year ago

That’s just the thing, it literally changes your prompt instructions to randomize gender and ethnicity even when you specify. And if you do specify, it might flag you as being inappropriate and give a refusal. Randomization has been a common strategy for image generators to combat implicit biases in the training data (more internet images of nurses are female, so asking for a “nurse” will almost always yield a female nurse unless the system randomly appends “male”), but Google appears to have gone way overboard, to the point where it scolds you for asking for a female nurse, since you’re being biased and should know men can also be nurses.
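For what it’s worth, the augmentation trick is simple enough to sketch. This is purely a hypothetical illustration of the general strategy described above, not Google’s or OpenAI’s actual pipeline; the role list, gender list, and function name are all made up:

```python
import random

# Word lists are illustrative placeholders, not anyone's real config.
GENDERS = ["male", "female"]
ROLES = {"nurse", "doctor", "engineer"}

def augment_prompt(prompt: str, rng: random.Random) -> str:
    """If the prompt names a role with no gender qualifier right before it,
    prepend a randomly chosen one so outputs don't all follow training-data skew."""
    words = prompt.split()
    out = []
    for i, word in enumerate(words):
        bare = word.strip(".,").lower()
        already_qualified = i > 0 and words[i - 1].lower() in GENDERS
        if bare in ROLES and not already_qualified:
            out.append(rng.choice(GENDERS))
        out.append(word)
    return " ".join(out)

print(augment_prompt("a nurse in a hospital", random.Random(0)))
# "a nurse ..." becomes "a male nurse ..." or "a female nurse ..."
print(augment_prompt("a male nurse in a hospital", random.Random(0)))
# already qualified, so the prompt is left alone
```

The complaint in the thread is essentially about the last case: a system like this should leave an explicitly qualified prompt untouched, and Gemini reportedly doesn’t.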

I’m in no way defending Gemini. But when I’m explicit about race with ChatGPT, it respects the prompt.

I have two coworkers in a private Slack and we are always generating crazy memes with ChatGPT. If I specify a bald Black guy (me), a white woman and a Filipino guy, it gets it right.

I tried some of the same prompts that Gemini refused to render or forced “diversity” onto, and ChatGPT did them correctly.