Comment by msgodel

3 days ago

I'm generally very pro-robot (every web UA is a robot, really, IMO), but these scrapers are exceptionally poorly written and abusive.

Plenty of organizations managed to crawl the web for decades without knocking things over. There's no reason to behave this way.
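For what it's worth, the well-behaved version isn't hard. Here's a minimal sketch in Python of a fetcher that honors robots.txt and rate-limits per host; the "ExampleBot" user agent, the one-second delay, and the PoliteFetcher class are illustrative placeholders, not any particular crawler's implementation:

```python
import time
import urllib.robotparser
from urllib.parse import urlparse, urljoin
from urllib.request import Request, urlopen

# Hypothetical bot identity; real crawlers publish a contact URL here.
USER_AGENT = "ExampleBot/1.0 (+https://example.com/bot)"

class PoliteFetcher:
    """Fetch pages while honoring robots.txt and a per-host rate limit."""

    def __init__(self, delay=1.0):
        self.delay = delay      # minimum seconds between hits to one host
        self.robots = {}        # host -> cached RobotFileParser
        self.last_hit = {}      # host -> monotonic time of last request

    def allowed(self, url):
        host = urlparse(url).netloc
        if host not in self.robots:
            rp = urllib.robotparser.RobotFileParser()
            rp.set_url(urljoin(url, "/robots.txt"))
            try:
                rp.read()
            except OSError:
                pass  # unreadable robots.txt: can_fetch stays conservative
            self.robots[host] = rp
        return self.robots[host].can_fetch(USER_AGENT, url)

    def fetch(self, url):
        if not self.allowed(url):
            return None  # robots.txt disallows this path; skip it
        host = urlparse(url).netloc
        # Throttle per host rather than globally, so no single origin
        # gets hammered even if the crawl as a whole runs fast.
        wait = self.delay - (time.monotonic() - self.last_hit.get(host, 0.0))
        if wait > 0:
            time.sleep(wait)
        self.last_hit[host] = time.monotonic()
        req = Request(url, headers={"User-Agent": USER_AGENT})
        with urlopen(req, timeout=10) as resp:
            return resp.read()
```

That's the whole trick: identify yourself, check robots.txt, and space out requests per host. None of it is new; it's how crawlers have worked for decades.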

It's not clear to me why they've continued to run them like this. It seems so childish and ignorant.

The bad scrapers would get blocked by the wall I mentioned. The ones intelligent enough to break the wall would simply take the easier way out and download the alternative data source.