Meta AI Image Generator Branded as 'Racist' After Failing to Generate Images of Interracial Couples

Meta's AI image generator, developed by Mark Zuckerberg's tech conglomerate, is facing severe backlash following allegations of racial bias.

According to The Daily Mail, users have reported that the AI fails to produce images depicting interracial couples, particularly those featuring an Asian man with a white woman, despite Zuckerberg's marriage to an Asian woman.

These user reports have sparked outrage on social media, with critics branding the AI as "racist software made by racist engineers."

Why Meta's AI Image Generator Was Called Racist

Mia Sato, a reporter at The Verge, conducted multiple tests with the AI, attempting to generate images of mixed-race couples using prompts like "Asian man and Caucasian friend" or "Asian man and white wife."

She found that the AI consistently returned images of East Asian men and women, failing to produce images of interracial couples as requested. Changing the prompts to describe platonic relationships also failed to return accurate results, indicating a systemic issue with the AI's algorithm.

Meta launched its AI image generator last year, but its failure to accurately represent diverse relationships has raised serious concerns about racial bias in AI technology.

The company has yet to respond to these allegations.

Tags
Meta, Mark Zuckerberg

