s0meON3 19 hours ago
What about using zip bombs? https://idiallo.com/blog/zipbomb-protection
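For context, here is a minimal sketch of the serve-a-gzip-bomb idea the linked post describes: pre-compress a large run of zeros once, then hand the small compressed file to clients you have already flagged as abusive, with Content-Encoding: gzip, so the inflating happens on their side. The file name, payload size, and the bare http.server handler below are placeholders, not the post's actual code.

    # Sketch of a gzip-bomb responder: build a small file that decompresses
    # to a huge run of zeros, then serve it with Content-Encoding: gzip so a
    # misbehaving client inflates it in its own memory.
    import gzip
    import http.server

    BOMB_PATH = "bomb.gz"             # hypothetical filename
    DECOMPRESSED_SIZE = 10 * 2**30    # 10 GiB of zeros before compression

    def build_bomb(path=BOMB_PATH, size=DECOMPRESSED_SIZE, chunk=2**20):
        """Write `size` zero bytes through gzip; the output is ~1000x smaller."""
        zeros = b"\x00" * chunk
        with gzip.open(path, "wb", compresslevel=9) as f:
            for _ in range(size // chunk):
                f.write(zeros)

    class BombHandler(http.server.BaseHTTPRequestHandler):
        def do_GET(self):
            # A real setup would only do this for requests already identified
            # as abusive; here every GET gets the bomb.
            with open(BOMB_PATH, "rb") as f:
                payload = f.read()
            self.send_response(200)
            self.send_header("Content-Encoding", "gzip")
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(payload)))
            self.end_headers()
            self.wfile.write(payload)

    if __name__ == "__main__":
        build_bomb()
        http.server.HTTPServer(("", 8080), BombHandler).serve_forever()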
lavela 19 hours ago
"Gzip only provides a compression ratio of a little over 1000: If I want a file that expands to 100 GB, I’ve got to serve a 100 MB asset. Worse, when I tried it, the bots just shrugged it off, with some even coming back for more."
https://maurycyz.com/misc/the_cost_of_trash/#:~:text=throw%2...
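The "a little over 1000" figure quoted there is easy to reproduce: DEFLATE's per-symbol overhead caps the ratio at roughly 1030:1 no matter how compressible the input is. A quick stdlib sanity check:

    # Check gzip/DEFLATE's compression ceiling on all-zero input: the ratio
    # tops out a little above 1000:1, which is why 100 GB of decompressed
    # output still costs ~100 MB of stored asset, as the quote says.
    import zlib

    raw = b"\x00" * (100 * 2**20)             # 100 MiB of zeros
    packed = zlib.compress(raw, 9)
    print(f"{len(raw) / len(packed):.0f}:1")  # prints roughly 1030:1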
LunaSea 17 hours ago
You could try different compression methods supported by browsers, like brotli.
Otherwise, you can also chain compression methods, e.g. "Content-Encoding: gzip, gzip".
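For what it's worth, the standard header form lists the encodings comma-separated, in the order they were applied, and the second pass does shrink the payload further because gzip's output for a long run of zeros is itself highly repetitive. A rough illustration (exact sizes will vary by zlib version):

    # Compress a run of zeros twice: the first gzip pass hits the ~1000:1
    # ceiling, the second pass compresses the repetitive gzip stream again,
    # so far fewer bytes go over the wire for the same decompressed size.
    import gzip

    raw = b"\x00" * (100 * 2**20)               # 100 MiB of zeros
    once = gzip.compress(raw, compresslevel=9)
    twice = gzip.compress(once, compresslevel=9)
    print(len(raw), len(once), len(twice))
    # Serve `twice` with "Content-Encoding: gzip, gzip" (encodings listed in
    # the order applied); a compliant client undoes both layers.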
kalkin 4 hours ago
Ah cool that site's robots.txt is still broken, just like it was when it first came up on HN...
renegat0x0 18 hours ago
Even I, who doesn't know much, implemented a workaround.
My web crawler has both a scraping byte limit and a timeout, so zip bombs don't bother me much (a sketch of that kind of guard follows below).
https://github.com/rumca-js/crawler-buddy
I think garbage blabber would be more effective.
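A minimal sketch of the byte-limit-plus-timeout guard described in the comment above, using requests with streaming; the 5 MiB budget and 10-second cap are placeholder numbers, not what crawler-buddy actually uses:

    # Fetch a page but bail out once either a byte budget or a wall-clock
    # limit is exceeded, so a zip bomb (or an endless stream) can't exhaust
    # the crawler. iter_content yields decoded bytes, so the budget counts
    # the inflated size, not the tiny compressed download.
    import time
    import requests

    MAX_BYTES = 5 * 2**20   # placeholder budget: 5 MiB of decoded body
    MAX_SECONDS = 10        # placeholder wall-clock limit

    def bounded_get(url):
        body = bytearray()
        start = time.monotonic()
        with requests.get(url, stream=True, timeout=(5, 5)) as resp:
            for chunk in resp.iter_content(chunk_size=64 * 1024):
                body.extend(chunk)
                if len(body) > MAX_BYTES or time.monotonic() - start > MAX_SECONDS:
                    break   # give up on oversized or slow responses
        return bytes(body[:MAX_BYTES])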