Comment by dlenski
3 months ago
> Crawlers can still start an arbitrary number of parallel crawls, but each one costs to start and needs to stay below some rate limit.
This is a nice explanation. It's much clearer than anything I've seen from Anubis’s authors about why or how it could be effective at keeping a site from being ravaged by hordes of ill-behaved bots.
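To make the quoted cost model concrete, here is a minimal back-of-the-envelope sketch. All numbers in it are hypothetical assumptions for illustration (the challenge cost, per-session rate cap, and token lifetime are not Anubis's actual parameters): if each session must solve a proof-of-work challenge and is then rate-limited, the PoW cost a crawler pays scales linearly with the aggregate request rate it wants to sustain, while a single polite reader pays it roughly once.

```python
import math

# Hypothetical parameters, chosen for illustration only:
POW_SECONDS_PER_SESSION = 2.0   # CPU-seconds to solve one challenge (assumed)
RATE_LIMIT_RPS = 1.0            # per-session request rate cap (assumed)
SESSION_TTL = 3600.0            # seconds a solved token stays valid (assumed)

def pow_cpu_seconds_per_hour(target_rps: float) -> float:
    """CPU-seconds/hour of proof-of-work needed to sustain target_rps.

    Each session is capped at RATE_LIMIT_RPS, so sustaining target_rps
    takes ceil(target_rps / RATE_LIMIT_RPS) live sessions, and each
    session must re-solve a challenge every SESSION_TTL seconds.
    """
    sessions = math.ceil(target_rps / RATE_LIMIT_RPS)
    renewals_per_hour = 3600.0 / SESSION_TTL
    return sessions * renewals_per_hour * POW_SECONDS_PER_SESSION

if __name__ == "__main__":
    # A human reader needs one session; a 1000-rps scraper needs 1000,
    # each paying the PoW cost, so cost grows linearly with crawl rate.
    for rps in (0.5, 10, 1000):
        print(f"{rps:>7} req/s -> {pow_cpu_seconds_per_hour(rps):.1f} CPU-s/hour of PoW")
```

Under these assumed numbers, the scraper pays roughly 2000× the ongoing PoW cost of an ordinary visitor, which is the asymmetry the quoted explanation is getting at.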