Google’s AI Overviews produce quirky, unreliable results, the company acknowledges.

TLDR:

Google admitted that its AI Overviews tool can generate inaccurate and odd results, including potentially dangerous information. The company is scaling back the feature and making improvements to address the issues.

Google recently launched its AI Overviews tool, which uses artificial intelligence to answer search queries directly. However, the company acknowledged that the technology produces some odd and erroneous overviews. Examples include suggesting glue to get cheese to stick to pizza and recommending drinking urine to pass kidney stones quickly. Some results were potentially dangerous, such as giving incomplete information about which mushrooms are edible or repeating a debunked conspiracy theory about a Muslim president of the U.S.

Google’s head of search, Liz Reid, said the company is scaling back the AI Overviews feature while continuing to improve it. The tool sometimes generated unhelpful responses because of nonsensical queries or misinterpretation of language on webpages. Reid said Google is adding restrictions for queries where AI Overviews have not proved helpful and limiting the use of user-generated content in responses, since such content can offer misleading advice.

Overall, Google is working to address the issues with its AI Overviews tool so that search results are more accurate and reliable, with the aim of improving the technology and the overall user experience.