Comment by jsheard

19 days ago

That's right, getting DDOSed is a skill issue. Just have infinite capacity.

A DDoS is different from crashing.

And I doubt Facebook implemented something that actually saturates the network; scrapers usually cap the number of concurrent connections and often also add a delay between new connections (e.g. max 10 concurrent, 100 ms delay).
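For illustration, the kind of throttling described above (a concurrency cap plus a spacing delay) can be sketched with an asyncio semaphore. The specific numbers and the `fetch` placeholder are assumptions, not anything Meta's crawler is known to do; a real crawler would issue actual HTTP requests (e.g. via aiohttp) instead of sleeping.

```python
import asyncio

MAX_CONCURRENT = 10    # assumed cap, mirroring the "max 10 concurrent" example
DELAY_SECONDS = 0.01   # assumed spacing between new connections (a polite
                       # crawler might use 100 ms; shortened here for the demo)

active = 0  # in-flight requests right now
peak = 0    # highest concurrency observed, to show the cap holds

async def fetch(url: str) -> str:
    """Stand-in for a real HTTP request; just sleeps and returns a string."""
    global active, peak
    active += 1
    peak = max(peak, active)
    await asyncio.sleep(0.2)  # pretend the server takes 200 ms to respond
    active -= 1
    return f"fetched {url}"

async def crawl(urls):
    sem = asyncio.Semaphore(MAX_CONCURRENT)

    async def worker(url):
        async with sem:          # never more than MAX_CONCURRENT in flight
            return await fetch(url)

    tasks = []
    for url in urls:
        tasks.append(asyncio.create_task(worker(url)))
        await asyncio.sleep(DELAY_SECONDS)  # delay between starting connections
    return await asyncio.gather(*tasks)

results = asyncio.run(crawl([f"https://example.com/page/{i}" for i in range(30)]))
```

Even with 30 URLs queued faster than the server can respond, the semaphore keeps the in-flight count at or below the cap, which is exactly why a well-behaved scraper shouldn't look like a network-saturating DDoS.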

Chances are the website operator is running a webserver with such poor RAM efficiency that it runs out of memory and crashes after 10 concurrent requests, or one that saturates the CPU on simple requests, or something like that.

  • You can doubt all you want, but none of us really know, so maybe you could consider interpreting people's posts a bit more generously in 2025.

  • I've seen concurrency in excess of 500 from Meta's crawlers to a single site. That site had just moved all its images, so every request hit the "pretty URL" rewrite and fell through to a slow dynamic request handler. It did not go very well.