this post was submitted on 12 Mar 2024
52 points (100.0% liked)

Technology

Brin’s “We definitely messed up”, said at an AI “hackathon” event on 2 March, followed a slew of social media posts showing Gemini’s image generation tool depicting a variety of historical figures – including popes, founding fathers of the US and, most excruciatingly, German second world war soldiers – as people of colour.

[–] GadgeteerZA@beehaw.org 16 points 7 months ago (17 children)

It's not just historical. I'm a white male, and I prompted Gemini to create images for me of a middle-aged white man building a Lego set, etc. Only one image was of a white man; two of the others were an Indian man and a Black man. Why, when I asked for a white man? It was an image I wanted to share with my family. Why would Gemini go off the prompt? I did not ask for diversity, nor was it expected for that purpose, and I got no other image options I could consider, so it was a fail.

[–] yiliu@informis.land 6 points 7 months ago

A while back, one of the image generation AIs (Midjourney?) caught flak because the majority of the images it generated only contained white people. Like...over 90% of all images. And worse, if you asked for a "pretty girl" it generated uniformly white girls, but if you asked for an "ugly girl" you got a more racially diverse sample. Wince.

But then their reaction was to just literally tack "...but diverse!" onto the end of prompts or something. They literally just inserted stuff into the text of the prompt. This solved the immediate problem, and the resulting images were definitely more diverse...but it led straight to the sort of problems that Google is running into now.
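To make the mitigation described above concrete: here is a minimal, entirely hypothetical sketch of that kind of naive prompt rewriting. Nothing here reflects Google's or Midjourney's actual code; the keyword lists and function name are invented for illustration. It also shows the failure mode from the comment above: if the rewriter doesn't check whether the user already specified an ethnicity (or checks badly), the appended instruction overrides the user's explicit request.

```python
import re

# Illustrative keyword lists -- not any vendor's real implementation.
PEOPLE_WORDS = {"person", "man", "woman", "girl", "boy", "people", "soldier"}
ETHNICITY_WORDS = {"white", "black", "asian", "indian", "hispanic"}

def rewrite_prompt(prompt: str) -> str:
    """Append a diversity instruction when the prompt mentions people
    but does not already specify an ethnicity."""
    words = set(re.findall(r"[a-z]+", prompt.lower()))
    if words & PEOPLE_WORDS and not words & ETHNICITY_WORDS:
        return prompt + ", ethnically diverse"
    return prompt

def rewrite_prompt_naive(prompt: str) -> str:
    """The blunter variant the comments describe: append unconditionally,
    ignoring what the user actually asked for."""
    words = set(re.findall(r"[a-z]+", prompt.lower()))
    if words & PEOPLE_WORDS:
        return prompt + ", ethnically diverse"
    return prompt
```

With the conditional version, "a pretty girl" gets the suffix while "a middle aged white man building a Lego set" passes through unchanged; the naive version rewrites both, which is exactly the behavior the first commenter ran into.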
