Meta’s AI generator struggles to render image of Asian person dating white man or woman

Meta’s artificial intelligence-powered text-to-image generator consistently fails to render photos of an Asian person dating a white man or woman — even though the company’s billionaire founder Mark Zuckerberg is married to the daughter of Chinese immigrants.

The AI tool, called Imagine, was released last December and appears to suffer from a revisionist malady similar to the one that afflicted Google’s AI image generator Gemini, which was paused after an outcry over wildly inaccurate depictions, including Native American popes and black Founding Fathers.

On Friday, The Post asked Imagine to create photos with prompts that read “Asian man and Caucasian friend,” “Asian man and white wife,” and “Asian woman and Caucasian husband.”

Meta’s AI image generator created this photo when prompted to produce one of an “Asian man with a white girlfriend.” Meta AI

The majority of the queries returned results that showed two Asian people.

The irony is that Meta’s founder and chief executive officer, Mark Zuckerberg, is married to Priscilla Chan, the daughter of Chinese immigrants. Getty Images

However, when the software was asked to create an image of an Asian woman with a black friend, it generated an accurate result. It did the same when asked to create romantic pairings between white and black people.

Imagine’s issues with rendering a white person dating an Asian person, first reported by The Verge, are head-scratching considering that Zuckerberg is married to Priscilla Chan. They have three kids.

“As we said when we launched these new features in September, this is new technology and it won’t always be perfect, which is the same for all generative AI systems,” a Meta spokesperson told The Post.

“Since we launched, we’ve constantly released updates and improvements to our models and we’re continuing to work on making them better.”

Critics have long contended that the algorithms used by AI image generators reflect the biases of the software engineers who program them.

This image was created with the prompt “Asian woman with white boyfriend.” Meta AI

The image generator comes with a disclaimer that warns some of its photos “may be inaccurate or inappropriate.”

Meta’s Imagine comes with a disclaimer that it may tend to produce images that are “inaccurate” and “inappropriate.” Meta AI
Meta’s Imagine did accurately create an image of an interracial couple — in this case a white man and a black woman. Meta AI

Imagine’s struggles come as the social media giant on Friday announced major changes to its policies on digitally created and altered media ahead of elections that will test its ability to police deceptive content generated by new artificial intelligence technologies.

The image generator also correctly created a depiction of a black man and a white woman. Meta AI

Meta — which owns Facebook, Instagram, WhatsApp, and Threads — will start applying “Made with AI” labels in May to AI-generated videos, images, and audio posted on its platforms, expanding a policy that previously addressed only a narrow slice of doctored videos, Vice President of Content Policy Monika Bickert said in a blog post.

Google said last year that AI labels are coming to YouTube and its other platforms.

With Post Wires
