Comment by iLoveOncall
20 days ago
4.8M requests sounds huge, but spread over 7 days and split among 30 websites, that's only about 0.26 requests per second per site, not exactly high, let alone abusive.
The fact that you chose to host 30 websites on the same instance is irrelevant; those AI bots crawl websites, not servers.
This is a recurring pattern I've seen in complaints about AI bots crawling websites: a huge headline request count that turns out to be a low TPS once you dig a bit deeper.
It's never that smooth.
In fact, 2M requests arrived on December 23rd from Claude alone, for a single site.
An average of 25 QPS is definitely an issue here; these are all long-tail dynamic pages.
Curious what your robots.txt looked like, do you have a link?
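A quick back-of-envelope check of both rates quoted in the thread (assuming traffic is spread evenly over 24-hour days; the quoted 25 QPS would suggest the 2M requests actually landed in fewer than 24 hours):

```python
# Sanity-check of the request rates discussed above.
# The inputs (4.8M requests / 7 days / 30 sites; 2M requests in one day)
# come from the thread; everything else is plain arithmetic.

SECONDS_PER_DAY = 24 * 60 * 60

# First commenter's scenario: 4.8M requests over 7 days across 30 sites.
total_requests = 4_800_000
per_site_tps = total_requests / (7 * SECONDS_PER_DAY) / 30
print(f"per-site TPS: {per_site_tps:.2f}")  # ~0.26

# Reply's scenario: 2M requests to a single site in one day.
single_day_requests = 2_000_000
avg_qps = single_day_requests / SECONDS_PER_DAY
print(f"average QPS: {avg_qps:.1f}")  # ~23.1
```

Under the uniform-day assumption the single-day figure works out to roughly 23 QPS rather than 25, which is close enough that the difference is likely just the traffic being concentrated in part of the day.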