Brin’s “We definitely messed up”, said at an AI “hackathon” event on 2 March, followed a slew of social media posts showing Gemini’s image generation tool depicting a variety of historical figures – including popes, founding fathers of the US and, most excruciatingly, German second world war soldiers – as people of colour.
It’s not just historical. I’m a white male, and I prompted Gemini to create images of a middle-aged white man building a Lego set. Only one image was of a white male; two of the others were an Indian man and a Black man. Why, when I asked for a white male? It was an image I wanted to share with my family. Why would Gemini go off the prompt? I did not ask for diversity, nor was it expected for that purpose, and I got no other options for images which I could consider, so it was a fail.
cupcakezealot@lemmy.blahaj.zone 8 months ago
was it really offensive or was it just “target selling pride clothes during pride month” offensive?
helenslunch@feddit.nl 8 months ago
I don’t know that “offensive” is the right word. More just “shitty” and “lazy”.
Like they took the time out to teach it “diversity” but couldn’t bother to train it past “diversity = people who are not white” or to acknowledge when the user is asking specifically for a white person or a different region or time period.
Faydaikin@beehaw.org 8 months ago
I, for one, welcome Japanese George Washington, Indian Hitler and Inuit Gandhi to our historical database.
CanadaPlus@lemmy.sdf.org 8 months ago
I think the lesson here is that political correctness isn’t very machine-learnable. Human history and modern social concerns are complex in precise ways, and really should be addressed with conventional rules and algorithms. Or manually, but that’s obviously not scalable at all.