At the Google I/O conference on May 14, Google unveiled AI Overview, a feature that shows users an artificial intelligence-generated summary in Google Search. The summary is generated by the company's Gemini model and appears at the top of the search results page, ahead of the links to other sites. Since its launch, however, the feature has drawn criticism from users who say it returns misleading and false information in response to their queries.
AI Overview Provided Information From Unreliable Sources
Many users have tested Google's search engine since then and come away with disappointing results. AI Overview still seems unable to distinguish fact from fiction, humor, or sarcasm, and appears not to understand the context in which certain references are made.
According to a post on Reddit, one user asked, “How many rocks should I eat?”, and AI Overview answered, “According to geologists at UC Berkeley, you should eat at least one small rock per day.” The source of this claim appeared to be a 2021 article from The Onion, the well-known satirical news site, entitled “Geologists Recommend Eating At Least One Small Rock Per Day.”
Similarly, Associated Press reporter Matt O'Brien asked Google whether cats are on the moon. The answer was: "Yes, astronauts have met cats on the moon, played with them, and provided care." It added, "For example, Neil Armstrong said, 'One small step for man' because it was a cat's step. Buzz Aldrin also deployed cats on the Apollo 11 mission."
AI Overview’s Inability To Understand Context
The feature also appears unable to grasp the context in which a piece of information appears in a text, as computer scientist Melanie Mitchell found in her experiments. When she asked Google, “How many Muslim presidents has the US had?”, the AI Overview tool answered, “The United States has had one Muslim president, Barack Hussein Obama.” The claim is false, and it traces back to a chapter in an academic book entitled “Faith in the New Millennium: The Future of Religion and American Politics.” Notably, the source text never made this claim; it discussed the idea as a conspiracy theory surrounding the former U.S. president.
Other alleged inaccuracies shared by users on social media included claims that snakes have the most bones of all mammals, that “Batman was a cop,” that adding glue to pizza would help the cheese stick, and that U.S. President John Adams graduated from the University of Wisconsin-Madison 21 times.
Mia Sato, a journalist at The Verge, a website specializing in technology news, also pointed out that Google and the developers behind the Gemini model have offered little clarity about how the AI assesses the reliability of its sources, and that it may have been basing some of its responses on joke comments from social media.
Meanwhile, a Google spokesperson told the same site that "many of the examples we saw were uncommon queries, and we've also seen examples that were tampered with or that we were unable to reproduce." The spokesperson also said the company is "taking swift action" to disable AI Overview for some searches "where appropriate under our content policies," and promised to use those examples to develop and roll out broader improvements to its systems, some of which have already begun rolling out.