Comment by __bb

7 hours ago

Whenever I read about poisoning LLM inputs, I'm reminded of a bit in Neal Stephenson's Anathem, where businesses poisoned the internet by publishing bad data that only their own tools could filter out:

> So crap filtering became important. Businesses were built around it. Some of those businesses came up with a clever plan to make more money: they poisoned the well. They began to put crap on the Reticulum [internet] deliberately, forcing people to use their products to filter that crap back out.

When I'm in a tinfoil-hat sort of mood, it feels like that world is not too far away.

EDIT: The book goes on to distinguish "bad crap", which might be random gibberish, from "good crap": an almost perfect document with one important error in it.

In effect, that sounds like what SEO / "trash article soup" companies have been doing to Google et al. for the last couple of decades.