
Comment by wewewedxfgdf

7 months ago

I protected uploads on one of my applications by creating fixed-size temporary disk partitions of about 10 MB each and unzipping into those; that contains the fallout if someone uploads something too big.

`unzip -p | head -c 10MB`

  • Doesn't deal with multi-file ZIP archives. And before you think you can just reject user uploads with multi-file ZIP archives, remember that macOS ZIP files contain the __MACOSX folder with ._ files.
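
    One way to extend the one-liner above to multi-file archives is to list the entries and cap each one separately. A rough sketch, assuming Info-ZIP unzip/zipinfo and GNU coreutils; the archive name, destination directory, and 10 MB limit are placeholders:

      #!/bin/sh
      # Sketch: per-entry size cap for multi-file ZIP uploads.
      ARCHIVE=upload.zip
      DEST=extracted
      LIMIT=$((10 * 1024 * 1024))   # 10 MB per entry

      mkdir -p "$DEST"
      zipinfo -1 "$ARCHIVE" | while IFS= read -r entry; do
          case "$entry" in */) continue ;; esac      # skip directory entries
          # Flatten paths to dodge zip-slip traversal (can collide on duplicate names).
          out="$DEST/$(basename "$entry")"
          # Note: head -c silently truncates anything over the cap instead of rejecting it.
          unzip -p "$ARCHIVE" "$entry" | head -c "$LIMIT" > "$out"
      done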

What? You partitioned a disk rather than just not decompressing some comically large file?

  • https://github.com/uint128-t/ZIPBOMB

      2048 yottabyte Zip Bomb
    
      This zip bomb uses overlapping files and recursion to achieve 7 layers with 256 files each, with the last being a 32GB file.
    
      It is only 266 KB on disk.
    

    When you realise it's a zip bomb it's already too late. The file size alone doesn't betray what's inside. Maybe apply some heuristics with ClamAV? But even then it's not guaranteed. I think a small partition to isolate decompression is actually really smart. Wonder if we can achieve the same with overlays.
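
    On the overlays question: a similar effect can be had without repartitioning by backing an overlay's writable upper layer with a size-limited tmpfs, so decompression can never write more than the tmpfs allows. A rough sketch for Linux, run as root; every path and the 64 MB figure are placeholders:

      # upperdir and workdir must live on the same filesystem, so both go on the tmpfs.
      mkdir -p /mnt/unzip-upper /mnt/unzip-merged
      mount -t tmpfs -o size=64m tmpfs /mnt/unzip-upper
      mkdir -p /mnt/unzip-upper/upper /mnt/unzip-upper/work
      mount -t overlay overlay \
            -o lowerdir=/srv/uploads,upperdir=/mnt/unzip-upper/upper,workdir=/mnt/unzip-upper/work \
            /mnt/unzip-merged
      # Decompress inside /mnt/unzip-merged; once the tmpfs is full, writes fail with ENOSPC.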

    • What are you talking about? You get a compressed file. You start decompressing it. When the number of bytes you've written exceeds some threshold (say 5 megabytes), just stop decompressing, discard the output so far and delete the original file. That is it.

      16 replies →
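
      A minimal sketch of that threshold approach in shell, assuming Info-ZIP unzip and GNU coreutils; the archive name, output directory, and 5 MB limit are placeholders:

        #!/bin/sh
        ARCHIVE=upload.zip
        LIMIT=$((5 * 1024 * 1024))   # 5 MB cap on total decompressed output

        # First pass: stream the decompressed bytes through head without touching
        # the disk. Reading at most LIMIT+1 bytes is enough to know the cap is blown.
        bytes=$(unzip -p "$ARCHIVE" | head -c "$((LIMIT + 1))" | wc -c)
        if [ "$bytes" -gt "$LIMIT" ]; then
            echo "archive expands past ${LIMIT} bytes; rejecting" >&2
            rm -f "$ARCHIVE"
            exit 1
        fi
        # Second pass: the archive is now known to expand to at most LIMIT bytes.
        # (Path-traversal checks are omitted here.)
        unzip -q "$ARCHIVE" -d extracted/

      The trade-off is that an accepted archive is decompressed twice; an oversized one costs only the CPU spent producing the first LIMIT+1 bytes before it is rejected.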

  • Seems like a good and simple strategy to me. No real partition needed; tmpfs is cheap on Linux. Maybe OP is using tools that do not easily allow tracking the number of uncompressed bytes.
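
    For reference, a size-limited tmpfs needs no real partitioning at all. A sketch for Linux, run as root; the mount point, size, and archive name are placeholders:

      mkdir -p /mnt/unzip-jail
      mount -t tmpfs -o size=10m,mode=0700 tmpfs /mnt/unzip-jail
      unzip -q upload.zip -d /mnt/unzip-jail   # writes start failing once the 10 MB cap is hit
      umount /mnt/unzip-jail                   # discards everything in one step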

  • Yes I'd rather deal with a simple out of disk space error than perform some acrobatics to "safely" unzip a potential zip bomb.

    Also zip bombs are not comically large until you unzip them.

    Also, that way you can just unpack any sort of compressed file format without giving any thought to whether you are handling it safely.

    • I'd put up fake paper names (doi.numbers.whatever.zip) to quickly grab their attention, along with a robots.txt that 'disallows' a /papers subdirectory. Add some index.html with links to fake 'papers' and in a week these crawlers will blacklist you like crazy.
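
      A tiny sketch of that honeypot layout; every path and filename here is invented, and a real setup would generate many more fake links:

        mkdir -p /var/www/site/papers
        # Tell well-behaved crawlers to stay out; only the misbehaving ones will dig in.
        printf 'User-agent: *\nDisallow: /papers/\n' > /var/www/site/robots.txt
        # An index of links to fake "papers" for those crawlers to follow.
        printf '<ul>\n  <li><a href="doi.10.1234.5678.zip">doi.10.1234.5678.zip</a></li>\n</ul>\n' \
            > /var/www/site/papers/index.html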