
Comment by aftbit

17 days ago

I did read the article. I'm skeptical of the claim, though: the author was careful to publish specific UAs for the bots, but then provided no additional information about the non-bot UAs.

>If you try to rate-limit them, they’ll just switch to other IPs all the time. If you try to block them by User Agent string, they’ll just switch to a non-bot UA string (no, really). This is literally a DDoS on the entire internet.

I'm also skeptical that _anyone_ needs to access the edit history at 10 qps. You could put an nginx rule on those routes that limits the edit history pages to 0.5 qps per IP and 2 qps across all IPs, which would protect the site from both bad AI bots and dumb MediaWiki script kiddies with little impact on legitimate visitors. A sketch of that rule is below.
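
Something like the following, assuming MediaWiki-style URLs where the history view is `index.php?...&action=history`; the zone names, sizes, and burst values are illustrative:

```nginx
# Only populate the limit keys when ?action=history; nginx skips
# rate limiting entirely when the key is an empty string, so all
# other pages are unaffected.
map $arg_action $history_ip_key {
    history $binary_remote_addr;
    default "";
}
map $arg_action $history_global_key {
    history "all";   # constant key = one bucket shared by everyone
    default "";
}

# 30 requests/minute = 0.5 qps per client IP.
limit_req_zone $history_ip_key zone=history_per_ip:10m rate=30r/m;
# One shared bucket across all clients: 2 qps site-wide.
limit_req_zone $history_global_key zone=history_global:1m rate=2r/s;

server {
    location /index.php {
        limit_req zone=history_per_ip burst=5 nodelay;
        limit_req zone=history_global burst=10 nodelay;
        limit_req_status 429;
        # ... hand the request off to PHP-FPM / the app as usual ...
    }
}
```

Bots that rotate IPs will still drain the shared bucket, but the backend never sees more than ~2 qps of history traffic either way.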

>Oh, and of course, they don’t just crawl a page once and then move on. Oh, no, they come back every 6 hours because lol why not.

And caching would fix this too, especially for pages that are guaranteed never to change (e.g. an edit-history diff page); see the sketch below.
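
A rough sketch, again assuming MediaWiki-style URLs where a rendered diff is `index.php?...&diff=...`; the cache path, TTLs, and backend address are placeholders:

```nginx
proxy_cache_path /var/cache/nginx/diffs keys_zone=diffs:10m
                 max_size=1g inactive=7d;

# Cache only requests that carry a ?diff= parameter.
map $arg_diff $skip_cache {
    ""      1;
    default 0;
}
# Only attach a long-lived Cache-Control header to diff responses;
# add_header skips headers whose value is an empty string.
map $arg_diff $diff_cache_control {
    ""      "";
    default "public, max-age=604800, immutable";
}

server {
    location /index.php {
        proxy_pass http://127.0.0.1:8080;   # placeholder app server
        proxy_cache diffs;
        proxy_cache_key $scheme$host$request_uri;
        proxy_cache_valid 200 7d;
        proxy_no_cache $skip_cache;
        proxy_cache_bypass $skip_cache;
        # MediaWiki may send its own no-cache headers; ignoring them
        # is what makes the 7-day TTL above stick. Fine for anonymous
        # traffic; be careful if logged-in sessions hit these URLs.
        proxy_ignore_headers Cache-Control Expires Set-Cookie;
        add_header Cache-Control $diff_cache_control;
    }
}
```

With this in place, the second crawl of the same diff never touches the wiki backend, and well-behaved clients won't even refetch for a week.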

Don't get me wrong, I'm not unsympathetic to the author's plight. The internet is an unsafe place full of bad actors, and a single bad actor can easily cause a lot of harm. But I don't think throwing up your hands and complaining is all that helpful. Instead, just apply the mitigations that have existed for this problem for at least 15 years and move on with your life. Your visitors will be happier and the bots will get boned.