Comment by AlienRobot
20 hours ago
I think the idea is good if it can actually curb the bot traffic that currently plagues the Internet.
However, a lot of recent bot traffic comes from sophisticated scrapers called "LLMs." You can tell Claude to "research X from www.example.com" and it will automatically scrape the page and summarize it, something an LLM is perfect for. Gemini tends to share links instead, presumably because most of Google's revenue comes from ads served on those websites, so completely killing their traffic would just make Google less money. Incidentally, I wonder whether Claude/Gemini use a search-engine-like "index" of all websites, or refuse to cache anything so they always fetch "fresh" data.
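For what it's worth, the fetch step is just ordinary scraping; the model only enters the picture once the page text is in hand. Here's a minimal Python sketch of that flow under my own assumptions, nothing from Anthropic's actual internals: the URL is a placeholder and the summarize step is just a comment standing in for the LLM call.

    # Hypothetical sketch of the "research X from www.example.com" flow:
    # fetch the page like any scraper, strip it to visible text, then
    # hand the text to the model.
    import urllib.request
    from html.parser import HTMLParser

    class TextExtractor(HTMLParser):
        """Collects visible text, skipping script/style blocks."""
        def __init__(self):
            super().__init__()
            self.chunks, self._skip = [], False
        def handle_starttag(self, tag, attrs):
            if tag in ("script", "style"):
                self._skip = True
        def handle_endtag(self, tag):
            if tag in ("script", "style"):
                self._skip = False
        def handle_data(self, data):
            if not self._skip and data.strip():
                self.chunks.append(data.strip())

    def fetch_page_text(url: str) -> str:
        req = urllib.request.Request(url, headers={"User-Agent": "research-agent/0.1"})
        with urllib.request.urlopen(req, timeout=10) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        parser = TextExtractor()
        parser.feed(html)
        return " ".join(parser.chunks)

    text = fetch_page_text("https://www.example.com")
    # summarize(text) would be the LLM call; the scraped text is all it needs.
    print(text[:500])

The point being: nothing in that loop is distinguishable from a plain scraper until a captcha gets in the way.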
If this is employed, I don't think the web is only going to be gatekept to Google devices. I think it will also be gatekept to Google's AIs.
Google would be able to display a captcha that no LLM could defeat, and then just let its own LLM pass through.
The same could be said about its other bots, such as the web crawler. Google's bot could crawl webpages that no other crawler ever could, simply because it has a free pass through captcha-gated GETs. Although the same could already be true today.
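To make that concrete: reverse-DNS verification is Google's documented way for site owners to identify genuine Googlebot traffic, so a captcha gate could whitelist it in a few lines. This is my guess at how the gate would look, not anything from the product page; handle_get and its responses are made up.

    # Minimal sketch of a captcha gate that waves Google's crawler
    # through while challenging every other client.
    import socket

    def is_verified_googlebot(ip: str) -> bool:
        try:
            host, _, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
            if not host.endswith((".googlebot.com", ".google.com")):
                return False
            return socket.gethostbyname(host) == ip  # forward-confirm
        except (socket.herror, socket.gaierror):
            return False

    def handle_get(ip: str) -> str:
        if is_verified_googlebot(ip):
            return "200 OK: full page"    # free pass for Google's bot
        return "403: solve this captcha"  # everyone else hits the gate

    print(handle_get("66.249.66.1"))  # an IP in a Googlebot range

Swap the captcha for a device-attestation check and you get the same asymmetry: only clients Google can vouch for get through, and Google can always vouch for itself.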
Their product page is full of info about how this works, wrapped in "agentic" cruft. They're still permitting your regular old scrapers and bots, for as long as they like you. Hope you're not thinking of running an independent system instead of a large cloud platform!