Comment by imtringued
8 days ago
The AI crawlers have tens of thousands of IPs and some of them use something akin to a residential botnet.
If they notice that they are getting rate limited or IP blocked, they will use each IP only once, which means IP-based rate limiting simply doesn't work.
The proof-of-work algorithm in Anubis creates an initial investment that is amortized over multiple requests. If you throw the proof away, you waste more energy; if you keep it, you can be identified and rate limited.
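A minimal sketch of that idea (not Anubis's actual protocol; the hash function, difficulty, and token handling here are assumptions for illustration): the client pays once to find a nonce whose hash of the server's challenge has enough leading zero bits, and the resulting (challenge, nonce) pair then acts like a reusable token the server can verify cheaply and rate limit.

```go
package main

import (
	"crypto/sha256"
	"encoding/binary"
	"fmt"
	"math/bits"
)

// difficulty is the required number of leading zero bits; an assumed value.
const difficulty = 16

// leadingZeroBits counts the leading zero bits of a hash.
func leadingZeroBits(h [32]byte) int {
	n := 0
	for _, b := range h {
		if b == 0 {
			n += 8
			continue
		}
		n += bits.LeadingZeros8(b)
		break
	}
	return n
}

// solve is the expensive client-side step: brute-force a nonce.
func solve(challenge []byte) uint64 {
	var buf [8]byte
	for nonce := uint64(0); ; nonce++ {
		binary.BigEndian.PutUint64(buf[:], nonce)
		h := sha256.Sum256(append(challenge, buf[:]...))
		if leadingZeroBits(h) >= difficulty {
			return nonce
		}
	}
}

// verify is the cheap server-side step: a single hash per request.
func verify(challenge []byte, nonce uint64) bool {
	var buf [8]byte
	binary.BigEndian.PutUint64(buf[:], nonce)
	h := sha256.Sum256(append(challenge, buf[:]...))
	return leadingZeroBits(h) >= difficulty
}

func main() {
	// In practice the challenge would be random and tied to the visitor.
	challenge := []byte("per-visitor-challenge")
	nonce := solve(challenge) // costly once
	// The solved pair is now an identity the server can rate limit;
	// discarding it forces the client to pay the solve cost again.
	fmt.Println("valid:", verify(challenge, nonce), "nonce:", nonce)
}
```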
The automated agent can never get around this, since running the code is playing by the rules. The goal of the automated agent is to ignore the rules.