Google claims the AI Overview on every search result, a frequently wrong summary of other people’s work, sends a ton of clicks back to the original publishers they ripped off. This is false; Pew Research Center’s data shows the opposite:
Most searchers don’t click on anything else if there’s an AI Overview: only 8% click on any other search result, versus 15% if there isn’t an AI summary.
I can’t get over that. An oligopolistic company imposes a source on its users that is very likely either hallucinating or plagiarizing or both, and most people seem to eat it up (out of convenience or naiveté, I assume).
An alternative explanation for a lot of this is that people are searching for something that interests them, seeing that every result is spam or shopping, and exiting the page.
Counter-theory: the now completely irrelevant search results and the idiotic summaries are a one-two punch that plunges the user into despair and makes them close the browser out of disgust.
If I’m not mistaken, even in pre-LLM days, Google had some kind of automated summaries which were sometimes wrong. Those bothered me less. The AI hallucinations appear to be on a whole new level of wrong (or is this just my personal belief - are there any statistics about this?).
Pre-LLM summaries were for the most part actually short.
They were more directly lifted from human-written sources. I vaguely remember lawsuits, or the threat of lawsuits, by newspapers over Google infoboxes and copyright infringement in pre-2019 days, but I couldn’t find anything very conclusive with a quick search.
They didn’t have the sycophantic, “hey, look at me, I’m a genius,” overly detailed (and wrong) tone that the current batch has.
Subjectively speaking: I usually scroll down just a little and find the source they trained on (stole from, rather). That one deserves a click most times, because it’s the actual source of the explanation.
Convenience is king, and never mind accuracy.