Comment by 10000truths
2 months ago
> Nobody has come up with a scalable metric for determining quality that can't be appropriated by SEO
Nor will it ever happen, at least as long as search is a Google monoculture. One dominant player in the search space means that everyone sets their sights on the same target, which naturally makes SEO an extremely one-sided arms race stacked against Google - "good content" is hard to quantify, and whatever proxy (or combination thereof) Google uses to estimate that quality will be ruthlessly uncovered, reverse engineered and exploited to its last inch by an unyielding, unending horde of "SEO optimizers".
The only way Google maintains search quality is by properly accounting for the fact that websites will take the path of least resistance, i.e. put in the minimum effort needed to optimize exactly the things Google measures. That means the heuristics and algorithms Google uses to determine search rankings must always be a moving target, whose criteria need to be vigilantly updated and constantly curated by people. Any attempt to fully automate that curation will produce the cyber equivalent of natural selection - SEO-optimizing websites adapting by automating the production of content that hits just the right buttons while remaining royally useless to actual visitors.