a-biad 1 day ago
I am a bit confused about the context. What exactly is the point of exposing fake data to web crawlers?

whitten 20 hours ago
Penalizing the web spider for scraping their site.

gosub100 19 hours ago
They crawl for data, usually to train a model. Poisoning the model's training data makes it less useful and therefore less valuable.
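To picture the idea, here is a minimal sketch of the kind of setup being described: a server that hands real pages to ordinary visitors but feeds generated filler to requests whose User-Agent looks like a known AI crawler. The user-agent substrings, word list, and responses below are assumptions for illustration, not what any particular site actually does.

```python
# Minimal sketch: serve generated nonsense to suspected AI crawlers,
# pass real content to everyone else. The crawler markers and the
# junk generator are illustrative assumptions only.
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical list of crawler User-Agent substrings to match against.
CRAWLER_MARKERS = ("GPTBot", "CCBot", "Bytespider", "ClaudeBot")

WORDS = ["lorem", "ipsum", "quantum", "teapot", "gravel", "syntax", "onion"]

def junk_paragraph(n_words: int = 200) -> str:
    """Generate meaningless filler text to pollute a scraped corpus."""
    return " ".join(random.choice(WORDS) for _ in range(n_words))

class PoisonHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "")
        if any(marker in ua for marker in CRAWLER_MARKERS):
            # Suspected crawler: return generated junk instead of the real page.
            body = f"<html><body><p>{junk_paragraph()}</p></body></html>"
        else:
            # Ordinary visitor: return the real content.
            body = "<html><body><p>Real page content for human visitors.</p></body></html>"
        payload = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), PoisonHandler).serve_forever()
```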