Comment by kuschku

19 hours ago

Since when is 10r/s flooding?

That barely registers as a blip even if you're hosting your site on a single server.

In our case it was a very heavy, specialized endpoint, and because each request used a different set of parameters it couldn't benefit from caching (in fact, it thrashed the caches with useless entries).

This forced us to scale up. When handling such a bot costs more than serving the rest of the users and bots combined, that's a problem, especially for our customers with smaller traffic.

The request rate varied from site to site, but it accounted for anywhere from half to 75% of total traffic and was basically saturating many servers for days unless blocked.
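For a rough sense of scale, here's a back-of-envelope sketch of why even "only" 10 r/s of uncacheable requests hurts. The per-request CPU cost below is an assumed figure for illustration, not something from the comment:

```python
# Back-of-envelope check with assumed (not measured) numbers: an uncacheable
# endpoint costing ~500 ms of CPU per request, hit by a bot at 10 r/s.
requests_per_second = 10
cpu_seconds_per_request = 0.5   # assumed cost of one uncached, dynamically generated page

cores_kept_busy = requests_per_second * cpu_seconds_per_request
core_hours_per_day = cores_kept_busy * 24

print(f"CPU tied up by the bot alone: {cores_kept_busy:.1f} cores")    # 5.0
print(f"Compute burned per day: {core_hours_per_day:.0f} core-hours")  # 120
```

Under those assumptions the bot alone keeps five cores pegged around the clock, which is consistent with the "more than the rest of the traffic combined" observation above.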

That depends on what you're hosting. Good luck if it's, say, a web interface for a bunch of git repositories with a long history. You can't cache effectively because there are too many pages and generating each page isn't cheap.

If you're serving static pages through nginx or something, then 10/sec is nothing. But if you're running Python code to generate every page, it can add up fast.
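For illustration, a minimal per-client token-bucket limiter is the usual way to cap a crawler like this at the application layer. The rate, burst, and client key below are hypothetical choices, not what any of the commenters actually deployed:

```python
import time
from collections import defaultdict

class TokenBucket:
    """Allows short bursts but caps sustained request rate per client."""

    def __init__(self, rate: float, burst: float):
        self.rate = rate            # tokens refilled per second
        self.burst = burst          # maximum bucket size
        self.tokens = burst
        self.updated = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at the burst size.
        self.tokens = min(self.burst, self.tokens + (now - self.updated) * self.rate)
        self.updated = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# One bucket per client key (e.g. IP address or user-agent fingerprint).
buckets = defaultdict(lambda: TokenBucket(rate=2, burst=10))

def should_serve(client_key: str) -> bool:
    return buckets[client_key].allow()
```

The hard part in practice isn't the bucket, it's picking a client key that actually groups the bot's requests together once it starts rotating IPs and user agents.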