Comment by rickyhatespeas
20 days ago
It will harm their own site, considering it's now un-indexable on platforms used by hundreds of millions of people and growing. Anyone using this is just guaranteeing that their content will be lost to history at worst, or inaccessible to most search engines and users at best. Congrats on beating the robots; now every time someone searches for your site they'll be taken straight to competitors.
> now every time someone searches for your site they will be taken straight to competitors
There are non-LLM forms of distribution, including traditional web search and human word of mouth. For some niche websites, a reduction in LLM-search users could be considered a positive community filter. If LLM scraper bots agree to follow longstanding robots.txt protocols, they can join the community of civilized internet participants.
Exactly. Not every website needs to be at the top of SEO (or LLM-O?). The niche web feels nicer and nicer as centralized platforms keep expanding.
You can still fine-tune though. I often run User-Agent: *, Disallow: / alongside User-Agent: Googlebot, Allow: /, because I just don't care for Yandex or Baidu to crawl me for the one user a year they'll send (of course this depends on the region you're offering things to).
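Roughly, a robots.txt along those lines would look like the sketch below (Googlebot is Google's real crawler token; swap in whichever bots matter for your audience, and note this only keeps out crawlers that actually honor the standard):

    # Block everything by default for compliant crawlers
    User-agent: *
    Disallow: /

    # Explicitly allow Google's crawler; per the spec, a bot uses the
    # most specific matching group, so this overrides the catch-all above
    User-agent: Googlebot
    Allow: /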
That other thing is just a more extreme version of the same idea, aimed at crawlers that don't behave. And when there's a clear value proposition in letting OpenAI ingest your content, you can simply allow them to.
I'd rather no one read it and it die forgotten than help "usher in the AI era"
Then why bother with a website at all?
I put my own recipes up so that when I'm shopping I can get the ingredient list. Sometimes we pull it up on a tablet while cooking.
Indeed, it's like dumping rotting trash all over your garden and saying "Ha! Now Jehovah's Witnesses won't come here anymore".
No, it's like building a fence because your neighbors' dogs keep shitting in your yard and their owners never clean it up.