This might sound familiar: Google rushes a half-baked AI tool out the door, then scrambles to fix it when things go wrong. We've seen it happen several times since Bard hit the market last year. Well, the curse has been passed down to Google's latest AI innovation, and the company is now rushing to fix the odd results produced by AI Overviews.
Google's AI Overviews were met with criticism as soon as they launched. They have proven detrimental to the news industry, so many users have looked for ways to disable the feature. Here's how to disable AI Overviews in case you're curious. Removing the feature is fairly simple, but the method varies depending on which browser you're using; one common workaround is sketched below.
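As a quick illustration of the kind of workaround that circulated at launch: forcing Google's "Web" results filter skips AI Overviews entirely. The minimal sketch below builds a search URL with the udm=14 parameter. Note the assumption here: udm=14 is not officially documented by Google, so treat it as a trick that may stop working rather than a guaranteed switch.

// Minimal sketch (assumption: Google's undocumented udm=14 parameter forces
// the plain "Web" results view, which does not show AI Overviews).
function webOnlySearchUrl(query: string): string {
  const url = new URL("https://www.google.com/search");
  url.searchParams.set("q", query);   // the user's search terms
  url.searchParams.set("udm", "14");  // "Web" filter; behavior may change
  return url.toString();
}

// Example: prints https://www.google.com/search?q=cheese+not+sticking+to+pizza&udm=14
console.log(webOnlySearchUrl("cheese not sticking to pizza"));

The same idea works without any code: in a Chromium-based browser you can register https://www.google.com/search?q=%s&udm=14 as a custom search engine, and in Firefox a bookmark keyword pointing at that URL achieves a similar result.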
Google is rushing to fix the odd answers given by AI Overviews
The new feature is now being panned for another reason. Numerous users across the web are posting screenshots of AI Overviews giving strange answers to their queries. Some of the most famous examples include Google telling people to put glue on their pizza, to eat rocks, and that cockroaches can live in… problematic areas of the male anatomy.
Wasn't the point of scraping data from legitimate sources to deliver accurate information? Remember that AI Overviews let people get answers without ever navigating to the websites that information comes from, and those websites are losing ad revenue because of it.
A Google spokesperson, Meghann Farnsworth, told The Verge that the company is "taking swift action" to rein in the responses, adding that "many of the examples we've seen have been uncommon queries, and we've also seen examples that were doctored or that we couldn't reproduce."
We have to give the company some credit: some of the queries were intentionally ridiculous. In the one where Gemini suggested eating rocks, the query was "how many rocks shall I eat". However, that's not the case for all of them. In the glue-on-pizza example, the query was "cheese not sticking to pizza", which is a legitimate question.
Google is removing bad responses
In response, Google has been manually removing the odd answers to these queries. We're not sure how long that will take, but it's a step the company has taken before. As Google rushes to clean up its mess, more strange responses are sure to surface. Let's just hope no one gets hurt by any of them.