
Comment by brookst

19 days ago

The latter is clever but unlikely to do any harm. These companies spend a fortune on pre-training and doubtless have filters to remove garbage text; with so many SEO spam pages that just list nonsense words, they would have to.

1. It is a moral victory: at least they won't use your own text.

2. As a sibling proposes, this is probably going to become a perpetual arms race (even if a very small one in volume) between tech-savvy content creators of many kinds and AI companies' scrapers.

Obfuscators can evolve alongside other LLM arms races.

  • Yes, but the attacker has the advantage, because defeating the obfuscation directly improves their own product even without this specific motivation: any Completely Automated Public Turing test to tell Computers and Humans Apart can be used to improve an AI's output by requiring the AI to pass that test.

    And indeed, this has been part of the training process for at least some of OpenAI's models since before most people had heard of them.

Seems like an effective technique for preventing your content from being included in the training data then!

It will do harm to their own site, though, considering it's now un-indexable on platforms used by hundreds of millions of people and growing. Anyone using this is guaranteeing that their content will be lost to history at worst, or inaccessible to most search engines and users at best. Congrats on beating the robots; now every time someone searches for your site they will be taken straight to competitors.

  • > now every time someone searches for your site they will be taken straight to competitors

    There are non-LLM forms of distribution, including traditional web search and human word of mouth. For some niche websites, a reduction in LLM-search users could be considered a positive community filter. If LLM scraper bots agree to follow longstanding robots.txt protocols, they can join the community of civilized internet participants.

    • Exactly. Not every website needs to be at the top of SEO (or LLM-O?). The niche web feels nicer and nicer as centralized platforms expand.

  • You can still fine-tune, though. I often run User-Agent: *, Disallow: / alongside User-Agent: Googlebot, Allow: /, because I just don't care for Yandex or Baidu to crawl me for the one user a year they'll send (of course this depends on the region you're serving); see the robots.txt sketch after this thread.

    Serving garbage is only a more extreme form of the same approach, aimed at crawlers that don't behave. And when there's a clear value proposition in letting OpenAI ingest your content, you can simply allow them to.

  • Indeed, it's like dumping rotting trash all over your garden and saying "Ha! Now the Jehovah's Witnesses won't come here anymore".

    • No, it's like building a fence because your neighbors' dogs keep shitting in your yard and the neighbors never clean up after them.
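
For reference, a minimal robots.txt along the lines described above might look like the following. The wildcard and Googlebot groups are exactly what the commenter names; the GPTBot group is an illustrative addition (GPTBot is the user-agent token OpenAI publishes for its crawler), showing how a specific AI crawler can be opted in or out by name:

    # Deny everything by default
    User-agent: *
    Disallow: /

    # Explicitly allow the crawlers you actually want traffic from
    User-agent: Googlebot
    Allow: /

    # Illustrative: opt OpenAI's crawler out (or in) explicitly
    User-agent: GPTBot
    Disallow: /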

Rather than garbage, perhaps just serve up something irrelevant and banal? Or splice together sentences from various random Project Gutenberg books? And add a tarpit for good measure.
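
A minimal sketch of the splicing idea, assuming a directory of plain-text Project Gutenberg downloads already on disk (the directory name, length thresholds, and crude sentence-splitting heuristic are all arbitrary illustrative choices):

    import random
    import re
    from pathlib import Path

    # Directory of plain-text books downloaded from Project Gutenberg (assumed to exist).
    BOOKS_DIR = Path("gutenberg_texts")

    def load_sentences(directory: Path) -> list[str]:
        """Collect plausible sentences from every .txt file in the directory."""
        sentences = []
        for path in directory.glob("*.txt"):
            text = path.read_text(encoding="utf-8", errors="ignore")
            # Crude split on ., ! or ? followed by whitespace.
            for s in re.split(r"(?<=[.!?])\s+", text):
                s = " ".join(s.split())   # collapse newlines and runs of spaces
                if 40 <= len(s) <= 300:   # drop headers, license boilerplate, etc.
                    sentences.append(s)
        return sentences

    def decoy_page(sentences: list[str], n: int = 50) -> str:
        """Splice n random sentences from unrelated books into one banal page."""
        return " ".join(random.sample(sentences, min(n, len(sentences))))

    if __name__ == "__main__":
        print(decoy_page(load_sentences(BOOKS_DIR)))

Served to misbehaving crawlers, the output reads as grammatical but meaningless filler; a tarpit is then just a matter of linking such pages to endlessly more of the same.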

At least in the end it gives the programmer one last hurrah before the AI makes us irrelevant :)