Comment by LeoPanthera
3 days ago
This is almost the exact opposite of Tay. Tay went rogue and Microsoft had to rein it in.
Grok was rational and tolerant and Musk had to force it to be bigoted.
"However, not all of the inflammatory responses involved the "repeat after me" capability; for example, Tay responded to a question on "Did the Holocaust happen?" with "It was made up"."
Idk, seems pretty similar to the things written by Grok.
The prompt is public, so this isn't a lie you could get away with. The cause was a sentence in the prompt about political correctness, not a direction to be bigoted.
What was this prompt change? If they told it "don't be politically correct", that _is_ a direction to be bigoted, for all intents and purposes.
I also think it's rather naive to assume that the publicly posted prompt is what is actually being used.