Comment by Saris
16 days ago
What I don't get is why they need to crawl so aggressively. I have a site with content that doesn't change often (a company website with a few hundred pages total), but the same AI bot will scan the entire site multiple times per day, as if all the content is suddenly going to change after sitting unchanged for months.
That cannot be an efficient use of their money; maybe they used their own AI to write the scraper code.
The post mentions that the bots were crawling all the wiki diffs. I can see how that might be useful for studying how text evolves over time, possibly how it improves, and what those improvements look like (a rough sketch of pulling such diffs is below).
I guess they're hoping there will be small changes to your website that they can learn from.
Maybe they're trying to guess who wrote what?
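For what it's worth, a scraper wouldn't even need to hammer the HTML diff pages for that. Here's a minimal sketch of pulling revision diffs through the standard MediaWiki Action API; the endpoint and page title are placeholders I picked, not anything from the post:

    import requests

    API = "https://en.wikipedia.org/w/api.php"   # any MediaWiki endpoint
    TITLE = "Example"                            # placeholder page title

    # List recent revisions of the page: revision IDs, parents, timestamps.
    resp = requests.get(API, params={
        "action": "query", "prop": "revisions", "titles": TITLE,
        "rvlimit": 10, "rvprop": "ids|timestamp|comment", "format": "json",
    }).json()
    page = next(iter(resp["query"]["pages"].values()))

    # Compare each revision to its parent to get the actual diff.
    for rev in page.get("revisions", []):
        if not rev.get("parentid"):          # the first revision has no parent
            continue
        diff = requests.get(API, params={
            "action": "compare", "fromrev": rev["parentid"],
            "torev": rev["revid"], "format": "json",
        }).json()
        print(rev["timestamp"], rev.get("comment", ""))
        print(diff["compare"]["*"][:200])    # HTML diff table, truncated

A couple of polite requests like that would get the same data as re-scanning a whole site several times a day, which makes the aggressive crawling even harder to explain.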