Comment by jug

14 days ago

The problem here, and with your comparison, is that Gemini (the language model) wasn't creating Black Vikings because of political bias in its training data, but because of how Google augmented user prompts to force-include diversity. Behind the scenes, your prompt was effectively rewritten to tell Gemini to always include racial diversity, even if you never asked for it.
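
To make the mechanism concrete, here's a rough sketch of what that kind of prompt-augmentation layer looks like. The injected wording and function names are my invention for illustration; Google never published the actual instruction text:

    # Hypothetical prompt-augmentation layer; the injected wording is
    # invented for illustration, not Google's actual instruction text.
    DIVERSITY_SUFFIX = (
        "Depict people of a diverse range of genders and ethnicities."
    )

    def augment_prompt(user_prompt: str) -> str:
        # The image model never sees the raw prompt, only this version.
        return f"{user_prompt}. {DIVERSITY_SUFFIX}"

    print(augment_prompt("A painting of a Viking longship crew"))
    # -> A painting of a Viking longship crew. Depict people of a
    #    diverse range of genders and ethnicities.

The key point is that the augmentation happens outside the model, so the model dutifully follows an instruction the user never wrote.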

But if you asked Gemini itself, it would tell you that Vikings were white.

This was later rectified in an update once Google realized the mistake they had made, since it caused gross historical inaccuracies. But it wasn't rectified by doing anything to Gemini the language model. The model had been right all along.