
Comment by babanin

4 years ago

At my first job (at the beginning of the millennium) there was a limit on the size of files you could download, around 5 MB. If you wanted something bigger, you had to ask the sysadmins to do it and wait... That was really annoying. So a colleague and I ended up writing a service that downloaded a file to local storage, chopped it into multiple 5 MB attachments, and emailed them to the requestor.

After some time the per-file limit was removed, but a daily limit of 100 MB was introduced. The trick was that POP3 traffic wasn't counted, so we kept using our "service".

That sounds suspiciously similar to how I used to download large files on a shared 2GB/month data plan. My carrier didn't count incoming MMS messages towards the quota, and conveniently didn't re-encode images sent to their subscribers via their email-to-MMS gateway. So naturally, I'd SSH into my server, download what I wanted, and run a bash script I wrote that split the downloaded file into MMS-sized chunks, prepended a 1x1 PNG image to each, and sent them sequentially through my carrier's gateway. This worked surprisingly well, and I had a script on my phone which would extract the original file from the sequence of "photos". It may still work, but I've since gotten a less restrictive data plan.
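For what it's worth, the split-and-prepend step can be sketched in a few lines of shell. Everything here (file names, chunk size, and the stub "PNG" header) is a stand-in; the comment doesn't include the actual script or the gateway details:

```shell
set -eu

# stand-ins: a fake payload and a fake 7-byte image header
# (a real script would prepend an actual 1x1 PNG)
head -c 2000 /dev/urandom > payload.bin
printf 'PNGSTUB' > header.png

# split the downloaded file into fixed-size chunks:
# payload.bin.aa, payload.bin.ab, ... (700 bytes is an assumed MMS-friendly size)
split -b 700 payload.bin payload.bin.

# prepend the image header to each chunk so it passes as a "photo"
for c in payload.bin.*; do
  cat header.png "$c" > "$c.mms"
done
```

Reassembly on the phone side is just the reverse: strip the fixed-length header from each received "photo" and concatenate the chunks in order.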

I couldn't download .exe files at some $CORPORATION. They had to be whitelisted or something, and the download just wouldn't work otherwise. But once you had the .exe you could run it just fine. You just had to ping some IT person to be able to retrieve your .exe.

Of course it was still possible to browse the internet and view arbitrary text, so splitting the .exe into base64-encoded chunks and uploading them to GitHub from another computer worked perfectly fine... I briefly argued against these measures, given how unlikely they are to prevent any kind of threat, but they're probably still in place.
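The base64 route is easy to reconstruct. This is a hypothetical sketch (file names and chunk length assumed), not the commenter's actual workflow:

```shell
set -eu
head -c 5000 /dev/urandom > tool.exe   # stand-in for the real .exe

# on the unrestricted machine: encode the binary as text,
# then split it into paste-sized pieces (chunk.aa, chunk.ab, ...)
base64 tool.exe > tool.exe.b64
split -l 20 tool.exe.b64 chunk.

# on the restricted machine: paste the chunks back together and decode
cat chunk.* | base64 -d > tool.rebuilt.exe
```

Each chunk file is plain text, so it can go through anything that lets you view arbitrary text — a GitHub repo, a gist, even a chat log.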