News

Google Pauses AI Tool Gemini’s Ability To Generate Images Of People After Historical Inaccuracies


Google has announced that it is temporarily suspending the ability of Gemini, its flagship suite of generative AI models, to generate images of people. The pause comes as the company works to update the model and improve the historical accuracy of its outputs.

Key Takeaway

Google has temporarily suspended its AI tool Gemini’s ability to generate images of people as it addresses historical inaccuracies in the outputs.

Addressing Recent Issues

In a post on the social media platform X, Google said it is working to address “recent issues” related to historical inaccuracies. The company described the move as a “pause” on generating images of people and said it would re-release an improved version soon.

Public Reaction

Google launched the Gemini image generation tool earlier this month. However, examples of it generating incongruous images of historical figures have surfaced on social media in recent days, including images depicting the U.S. Founding Fathers as Native American, Black, or Asian, which drew criticism and ridicule.

Paris-based venture capitalist Michael Jackson joined the criticism, branding Google’s AI as “a nonsensical DEI parody” in a post on LinkedIn.

Google’s Response

In a statement on social media, Google confirmed it was “aware” that the AI was producing “inaccuracies in some historical image generation depictions” and said it is working to improve them immediately. The company acknowledged that while Gemini’s image generation produces a wide range of people, it has “missed the mark” in this instance.

Challenges Faced by Generative AI Tools

Generative AI tools produce outputs based on their training data and other parameters, and they have often faced criticism for biased results such as stereotypical imagery or misclassifications. Google’s earlier image classification technology drew backlash in 2015 when Google Photos labeled Black men as gorillas. Despite promises to fix the issue, the reported ‘fix’ was a workaround: Google simply blocked the technology from recognizing gorillas at all.
