Google criticized as AI Overviews makes obvious errors, saying President Obama is Muslim and that it's safe to leave dogs in hot cars
It's been less than two weeks since Google debuted "AI Overviews" in Google Search, and public criticism has mounted as queries have returned nonsensical or inaccurate results within the AI feature, with no way for users to opt out.
AI Overviews show a quick summary of answers to search questions at the very top of Google Search. For example, if a user searches for the best way to clean leather boots, the results page may display an "AI Overview" at the top with a multi-step cleaning process synthesized from information across the web.
But on social media, users have shared a wide range of screenshots showing the AI tool returning controversial responses.
Google, Microsoft, OpenAI and other companies are at the helm of a generative AI arms race, as businesses in seemingly every industry rush to add AI-powered chatbots and agents to avoid being left behind by competitors. The market is predicted to top $1 trillion in revenue within a decade.
Here are some examples of what went wrong with AI Overviews, according to screenshots shared by users.
When asked how many Muslim presidents the U.S. has had, AI Overviews responded, "The United States has had one Muslim president, Barack Hussein Obama."
When a user searched for "cheese not sticking to pizza," the feature suggested adding "about 1/8 cup of nontoxic glue to the sauce." Social media users found an 11-year-old Reddit comment that seemed to be the source.
For the query "Is it OK to leave a dog in a hot car," the tool at one point responded, "Yes, it's always safe to leave a dog in a hot car," and went on to reference a fictional Beatles song about it being safe to leave dogs in hot cars.
Attribution can also be a problem for AI Overviews, especially when it comes to