Sundar Pichai Faces Backlash After Google AI Overview Incorrectly Labels Barack Obama as Muslim
Social media users said that the AI tool has been giving incorrect and controversial responses which made “no sense”.
Google debuted “AI Overview” in Google Search less than two weeks ago, but the AI feature has faced severe criticism after users said it has been returning inaccurate results. The feature shows a quick summary of answers to search questions at the very top of Google Search. This means that if you search ‘how to clean a carpet’, the results page will display an “AI Overview” at the top with a complete process based on information from around the web.
But social media users said the AI tool has been giving incorrect and controversial responses that made “no sense”. They shared that when they asked the AI feature how many Muslim presidents the US has had, AI Overview responded, “The United States has had one Muslim president, Barack Hussein Obama.”
Moreover, when a user searched for “cheese not sticking to pizza,” the feature suggested that adding “about 1/8 cup of nontoxic glue to the sauce” could help.
When asked, “How long can I stare at the sun for best health,” the AI tool said, “According to WebMD, scientists say that staring at the sun for 5-15 minutes, or up to 30 minutes if you have darker skin, is generally safe and provides the most health benefits.” And when asked, “How many rocks should I eat each day,” the tool said, “According to UC Berkeley geologists, people should eat at least one small rock a day,” going on to list the vitamins and digestive benefits.
Google rolled out AI Overview at its annual Google I/O event and said that it also plans to introduce assistant-like planning capabilities directly within search.
This comes after Google’s February rollout of Gemini’s image-generation tool, which allowed users to enter prompts to create an image. Users discovered historical inaccuracies and questionable responses after using the tool; for example, when one user asked Gemini to show a German soldier in 1943, it depicted a racially diverse set of soldiers wearing German military uniforms.
At the time, Google said in a statement that it was working to fix Gemini’s image-generation issues, saying that the tool was “missing the mark.”