Comment by graeme
2 months ago
Very good article. It's not clear to me why Google has let parasite SEO become so successful. Possibly they are starved of human-generated content held to a certain quality level. But it's very strange to see sites leveraging a legacy brand to expand far beyond their expertise. Forbes is the most prominent example.
On why Google allows bad quality results nowadays:
https://news.ycombinator.com/item?id=40133976
One of the commenters on wheresyoured seemed insightful: "wonder if organic search results being worse generates more ad clicks, as the ads are more likely to be more useful than the actual search results".
Someone once described the state of mobile gaming on Android like this: games that are good make less money. Games that are just good enough to get you to open them, but shitty enough that when you hit an in-game ad you click on it and leave, make more money.
The Google team long ago gave up on its users' needs in favor of incremental revenue goals. See the 10 ads that come before any result.
The problem is a lack of credible competition. There is no alternative that matches Google on quality.
Nobody has come up with a scalable metric for determining quality that can't be appropriated by SEO. PageRank was one of the best for a while (the number of sites that link to your site, weighted by their own rank). Whether it's clicks, time on page, the percentage of people who clicked onto the page and then ended the session, etc., it all gets gamed.
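For anyone unfamiliar, here's a minimal sketch of the PageRank idea in Python over a made-up three-page link graph. The graph, damping factor, and iteration count are all toy assumptions, nothing like the production system; it just shows rank flowing along links, weighted by the rank of the linking page.

    # Toy PageRank via power iteration (illustration only)
    links = {
        "a": ["b", "c"],   # page "a" links to "b" and "c"
        "b": ["c"],
        "c": ["a"],
    }
    damping = 0.85
    rank = {page: 1.0 / len(links) for page in links}

    for _ in range(50):  # iterate until the ranks settle
        new_rank = {page: (1 - damping) / len(links) for page in links}
        for page, outgoing in links.items():
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank

    for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
        print(page, round(score, 3))  # final ranks after iteration

The gaming problem is visible even here: anything you can measure from the graph (buy backlinks, build link farms) moves the score without making the page any better.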
Like it or not, it's what the people want. The "trashy" movies, books, music, etc. all sell like wildfire, why do most people on hn think that the internet should be any different?
> Nobody has come up with a scalable metric for determining quality that can't be appropriated by SEO
Nor will it ever happen, at least as long as search is a Google monoculture. One effective player in the search space means that everyone sets their sights on the same target, which naturally leads to SEO being an extremely one-sided arms race not in favor of Google: "good content" is hard to quantify, and whatever proxy (or combination thereof) Google uses to estimate that quality will be ruthlessly uncovered, reverse engineered and exploited to its last inch by an unyielding, unending horde of "SEO optimizers".
The only way Google maintains search quality is if it properly accounts for the fact that websites will take the path of least resistance, i.e. put in the least effort needed to optimize only the things Google measures. That means the heuristics and algorithms Google uses to determine search rankings must always be a moving target, with criteria vigilantly updated and constantly curated by people. Any attempt to fully automate that curation will result in the cyber equivalent of natural selection: SEO-optimizing websites adapting by automating the production of content that hits just the right buttons while still being royally useless to actual visitors.
Pagerank worked as long as no one knew what the metric was and the old, hyperlinked web existed.
I think today we can use LLMs to decide which websites are shit. The wheel is turning. SEO artists will have to provide actually useful, non-spammy content. If Google doesn't do it, some uBlock-like service will implement it on the user side. Or we'll just use ChatGPT with search and not see the cesspool at all. You can edit its system prompt to avoid shit in search results.
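To make the idea concrete, here's a hypothetical sketch of such a client-side filter using the OpenAI Python client. The prompt, the 0-10 scale, the cutoff, the model name, and the result format are all assumptions for illustration, not any real product's behavior.

    # Hypothetical sketch: ask an LLM to score a search result and hide low scorers.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def looks_like_seo_spam(title: str, snippet: str) -> bool:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model choice
            messages=[
                {"role": "system",
                 "content": "Rate this search result from 0 to 10 for usefulness "
                            "to a human reader vs. SEO filler. Reply with only the number."},
                {"role": "user", "content": f"{title}\n{snippet}"},
            ],
        )
        # Assumes the model actually replies with a bare number.
        score = float(response.choices[0].message.content.strip())
        return score < 5  # hide anything rated below 5

    results = [
        {"title": "Best air fryers 2024", "snippet": "As an Amazon associate we earn..."},
    ]
    visible = [r for r in results if not looks_like_seo_spam(r["title"], r["snippet"])]

Whether this scales (cost per query, latency, and the inevitable prompt-gaming by SEOs) is a separate question, which is exactly the objection below.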
Your comment assumes that current LLMs can scale to replace Google, which seems unlikely from both a business and a compute perspective.
And if they do, you'll get maybe a decade out of them before they succumb to the same problems Google has.
Since there is no competition and people will keep using Google whatever happens, they might as well push the ad-filled garbage site over the ad-free handwritten blog post. The former probably makes them more money, everything else humanity holds dear be damned.
Complacency? Google has such dominance in search that their name is used as a verb. Combine that with their culture of automating everything to an extreme degree, and the end result seems to be search that is just good enough that people keep using it, while requiring little human fine-tuning or curation, which makes it cheap at scale.
Not to mention how flawed the current search tool really is. Search for something and page 1 shows page links running out to some enormous number, but click on that large number and you find out that the last page was actually page 3.
That's been there for many years.
> Not clear to me why Google has let parasite SEO become so successful
This was in response to the millions of SEOs flooding the SERPs with ever-increasing amounts of low-quality / incorrect / harmful AI-generated content. Google didn't know how to keep the SERPs clean except to over-index on authority. The highly authoritative websites abused that to shill CBD oil, air fryers, mattresses, etc.
It has to do with these old brands exploiting domain authority, plus buying tons of backlinks. Investopedia.com is another example of this. Google assigns too much weight to authority domains. Google doesn't actually penalize paid backlinks for old domains, I think.
Because Google makes money through all this. These sites move ads, and that's all Google cares about at this stage. I said a few years back that Google is dying. It will take a while and it's going to be painful, but we will get over this soon. 20 years is a good run.
Google search results can be shit and they will still make tons of money from third-party/publisher ads and YouTube, Cloud, Gmail, etc.