
Comment by bityard

1 day ago

If you're the operator of a botnet that normally scrapes a few dozen pages per second and you notice a site suddenly taking multiple seconds per page, that's at least an order of magnitude (or two) drop in throughput. If you care at all about efficiency, you step in and put that site on your blacklist.

Even if the bot owner doesn't watch (or care about) their crawling metrics, at least the botnet is not DDoSing the site in the meantime.

This is essentially a client-side tarpit. Tarpits are actually pretty effective against all forms of bot traffic while barely impacting legitimate users, if at all.

A tarpit is selective. You throw bad clients in the tarpit.

This is something you throw everyone through: both your abusive clients (running on stolen or datacenter hardware) and your real clients (running on battery-powered laptops and phones). More like a tar-checkpoint.
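To make the distinction concrete, here is a minimal sketch of a *selective* server-side tarpit. Everything in it is hypothetical (the `SUSPECT_CLIENTS` set, the `handle_request` helper, the fixed sleep): the point is only that the delay lands on flagged clients, whereas a tar-checkpoint would impose it on everyone.

```python
import time

# Hypothetical blacklist of client identifiers (IPs here; could be
# user agents, ASNs, behavioral scores, etc.)
SUSPECT_CLIENTS = {"203.0.113.7", "198.51.100.23"}

def handle_request(client_ip, render_page, delay=5.0):
    """Serve a page, tarpitting only clients judged abusive.

    A selective tarpit slows down flagged clients (here via a crude
    sleep) while legitimate users get a normal, fast response. A
    proof-of-work "tar-checkpoint" would instead make every client,
    good or bad, pay the cost up front.
    """
    if client_ip in SUSPECT_CLIENTS:
        time.sleep(delay)  # burn the bot's time, not the honest user's
    return render_page()
```

In a real deployment you'd hold the connection open asynchronously rather than block a worker thread with `sleep`, but the selectivity is the part that matters here.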