Amazon does publish every IP address range used by AWS, so there is the nuclear option of blocking them all pre-emptively.
https://docs.aws.amazon.com/vpc/latest/userguide/aws-ip-rang...
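For reference, the data behind that page is published as JSON at https://ip-ranges.amazonaws.com/ip-ranges.json, so dumping every IPv4 CIDR for a blocklist is a few lines of Python. A minimal sketch; how you feed the output into your firewall is up to you:

    # Fetch AWS's published IP ranges and print every IPv4 CIDR.
    import json
    import urllib.request

    URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"

    with urllib.request.urlopen(URL) as resp:
        data = json.load(resp)

    # Each prefix entry carries the CIDR plus the region and service using it.
    for cidr in sorted({p["ip_prefix"] for p in data["prefixes"]}):
        print(cidr)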
I'd do that, but my DNS is via Route 53. Blocking AWS would block my ability to manage DNS automatically, as well as certificate issuance via DNS-01.
They list a service for each address, so maybe you could block all the non-Route 53 IP addresses. Although that assumes they aren’t using the Route 53 IPs or unlisted IPs for scraping (the page warns it’s not a comprehensive list).
Regardless, it sucks that you have to deal with this. The fact that you’re a customer makes it all the more absurd.
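A rough sketch of that filter against the same JSON file. The ROUTE53_* service labels to exempt are my assumption and worth checking against the file itself; note also that the same CIDR can appear under both AMAZON and a specific service, so the exempt set has to be subtracted rather than filtered row by row:

    import json
    import urllib.request

    URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"
    # Assumed Route 53 related service labels; verify against the file.
    EXEMPT = {"ROUTE53", "ROUTE53_HEALTHCHECKS", "ROUTE53_RESOLVER"}

    with urllib.request.urlopen(URL) as resp:
        prefixes = json.load(resp)["prefixes"]

    # A CIDR can be listed under multiple services (e.g. AMAZON plus the
    # specific one), so collect the exempt CIDRs first and subtract them.
    exempt = {p["ip_prefix"] for p in prefixes if p["service"] in EXEMPT}
    for cidr in sorted({p["ip_prefix"] for p in prefixes} - exempt):
        print(cidr)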
If you only block new inbound connections, it shouldn't impact your Route 53 or DNS-01 usage, since those are outbound from your side.
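Something like the sketch below, assuming Linux with ipset/iptables, root, and an existing ESTABLISHED/RELATED accept rule ahead of it. The set name "aws-block" is made up, and it reads CIDRs (e.g. from the snippets above) on stdin:

    # Load CIDRs into an ipset and drop only NEW inbound connections from
    # them; replies to your own outbound sessions (Route 53 API, DNS-01)
    # still match the earlier conntrack ESTABLISHED rule.
    import subprocess
    import sys

    subprocess.run(["ipset", "-exist", "create", "aws-block", "hash:net"],
                   check=True)
    for line in sys.stdin:
        cidr = line.strip()
        if cidr:
            subprocess.run(["ipset", "-exist", "add", "aws-block", cidr],
                           check=True)

    subprocess.run([
        "iptables", "-I", "INPUT",
        "-m", "set", "--match-set", "aws-block", "src",
        "-m", "conntrack", "--ctstate", "NEW",
        "-j", "DROP",
    ], check=True)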
It’ll most likely help eventually, as long as they don’t have an infinite address pool.
Do these bots use some client software (browser plugin, desktop app) that’s consuming unsuspecting users’ bandwidth for distributed crawling?
Monitor access logs for links that only crawlers can find.
Edit: oh, I got your point now.
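For the access-log idea: plant a link humans never see (hidden in the markup, disallowed in robots.txt) and flag every client that fetches it. A quick sketch; the trap path and log location are placeholders, and it assumes a common/combined log format with the client IP as the first field:

    import re

    TRAP = "/honeypot-do-not-follow"    # hypothetical trap URL
    LOG = "/var/log/nginx/access.log"   # adjust for your server

    ips = set()
    with open(LOG) as f:
        for line in f:
            if TRAP in line:
                m = re.match(r"(\S+)", line)  # first field: client IP
                if m:
                    ips.add(m.group(1))

    for ip in sorted(ips):
        print(ip)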