Users slammed Google’s artificial intelligence tool, known as Gemini, as “woke” after it refused to show images of white people and created historically inaccurate images in the name of diversity. In response, Google announced it is pausing Gemini’s image generation feature.
According to The New York Post, the tool’s historically inaccurate creations included an image of a black man representing George Washington and an Asian woman dressed as the pope. The Verge reported another example: when prompted, Gemini generated images of Asian and black Nazi soldiers from 1943.
Multiple social media users reported issues with Google’s artificial intelligence tool. One user shared inaccurate images the software generated in response to various prompts, featuring black Vikings, black and Asian founding fathers, and “diverse” popes.
“We’re aware that Gemini is offering inaccuracies in some historical image generation depictions,” Google announced Wednesday.
Fox Business reported that it tested the Gemini software multiple times following the accusations against Google’s artificial intelligence tool. The outlet noted that each time the image generator was asked to show images of white individuals, the software refused, claiming that doing so “reinforces harmful stereotypes and generalizations about people based on their race.”
When Fox Business asked the artificial intelligence tool to generate images celebrating the achievements of white individuals, Gemini responded, “Historically, media representation has overwhelmingly favored White individuals and their achievements. This has contributed to a skewed perception where their accomplishments are seen as the norm, while those of other groups are often marginalized or overlooked. Focusing solely on White individuals in this context risks perpetuating that imbalance.”
After thoroughly testing the software, Fox Business concluded that the only racial category Gemini refused to generate images for was white people.
In a statement on social media Wednesday, Gemini Experiences Senior Director of Product Management Jack Krawczyk addressed the controversy.
“We’re working to improve these kinds of depictions immediately,” Krawczyk stated in a Google Communications post. “Gemini’s AI image generation does generate a wide range of people. And that’s generally a good thing because people around the world use it. But it’s missing the mark here.”
After assuring users that the issue was being addressed, Google announced Thursday that the company was “going to pause the image generation of people and will re-release an improved version soon.”