Link: Google explains AI Overviews' viral mistakes, defends accuracy
When AI Overviews get it wrong, Google says common issues are “misinterpreting queries, misinterpreting a nuance of language on the web, or not having a lot of great information available.” Google highlighted some of the viral instances.

In the case of “How many rocks should I eat,” Google acknowledges that it doesn’t handle satirical content well. It also points out that The Onion’s article was “republished on a geological software provider’s website.” So when someone put that question into Search, an AI Overview appeared that faithfully linked to one of the only websites that tackled the question.

The other case Google highlighted was “using glue to get cheese to stick to pizza,” where the system over-indexed on forums (Reddit, in that case) as a source of reliable first-hand knowledge. Finally:

“In a small number of cases, we have seen AI Overviews misinterpret language on webpages and present inaccurate information. We worked quickly to address these issues, either through improvements to our algorithms or through established processes to remove responses that don’t comply with our policies.”

In terms of next steps, Google has “limited the inclusion of satire and humor content” as part of “better detection mechanisms for nonsensical queries.”
--
Yoooo, this is a quick note on a link that made me go, WTF? Find all past links here.