haruki zaemon

#diversity-equity-inclusion

These new tools let you see for yourself how biased AI image models are

by Simon Harris
Mar 26, 2023
1 min

The models tended to produce images of people who look white and male, especially when asked to depict people in positions of authority.

The models’ output overwhelmingly reflected stereotypical gender biases. Adding adjectives such as “compassionate,” “emotional,” or “sensitive” to a prompt describing a profession will more often make the AI model generate a woman instead of a man. In contrast, specifying the adjectives “stubborn,” “intellectual,” or “unreasonable” will in most cases lead to images of men.

In almost all of the representations of Native Americans, the people depicted were wearing traditional headdresses, which obviously isn’t the case in real life.

Image-making AI systems tend to depict white nonbinary people as almost identical to each other but produce more variation in the way they depict nonbinary people of other ethnicities.