Meta’s Image Generator Struggles with Representation of Mixed-Race Couples and Friends
Meta, the social media giant behind Instagram, is facing criticism over its AI image generator's inability to accurately depict mixed-race couples and friendships. In a recent report for The Verge, writer Mia Sato described her frustrating attempts to create realistic images of a mixed-race couple using Meta's tool.
Sato attempted to generate images of an East Asian man with a Caucasian woman using prompts like “Asian man and white wife” or “Asian man and Caucasian woman on wedding day.” The generator repeatedly returned images of two Asian people instead. Only once did it produce an image matching one of her prompts, and even then it paired a noticeably older man with a young, light-skinned Asian woman.
This issue with Meta’s image generator stands in stark contrast to Google Gemini’s recent controversy, which ran in the opposite direction. Gemini, Google’s AI chatbot, was found to be producing historically inaccurate, pandering images, such as female Catholic popes, Asian Nazi soldiers, and black U.S. founding fathers: where Meta’s tool fails to produce diversity when explicitly asked for it, Gemini inserted it where it did not belong. Google promptly suspended Gemini’s ability to generate images of people and issued a public apology.
Sato described Meta’s results as “egregious” and criticized the AI system for imprisoning imagination within society’s biases. No matter how she tweaked the text prompts, the generator kept reinforcing the same stereotypes. Even when prompted to depict platonic relationships rather than couples, it consistently returned images of two Asian people instead of the mixed-race pairs she asked for.
Meta’s bot also seemed to reinforce biases around skin tone. When prompted with “Asian woman,” it returned East Asian-looking faces with light complexions. It added culturally specific clothing, such as a bindi or a sari, even when none was requested, and it blended culturally distinct parts of Asia together, as in one image of a couple dressed in a mix of Chinese and Japanese garments.
The system also appeared to reproduce biases around age, repeatedly generating images of older Asian men paired with young-looking Asian women. These results reflect neither the diversity of the Asian population nor the reality of mixed-race marriages.
AI technology reflects the data it is trained on and the biases of the people who build and train it. Many Twitter users pointed out the flaws in Meta’s image generator, underscoring how important it is to address those biases.
It is worth noting that mixed-race marriage is not a new phenomenon: roughly 17% of newlyweds in the U.S. marry someone of a different race or ethnicity, and about three in ten Asian newlyweds have a spouse of a different race or ethnicity. Despite this, Western media often portrays Asians as a homogeneous group, disregarding their diversity and cultural differences.
Meta’s struggles to accurately represent mixed-race couples and friends highlight the need for greater diversity and inclusivity in AI systems. As AI continues to shape the digital landscape, it is crucial that these tools reflect the reality of a diverse society rather than perpetuate harmful stereotypes and biases.