
Comment by judge2020

5 years ago

> What's most interesting is that the bucket is private, so the only way they could identify that there is something malicious at a URL is if someone downloads it using Chrome. I'm assuming they make this decision based on some database of checksums.

Doesn't Chrome upload everything downloaded to VirusTotal (a Google product)?

> Doesn't Chrome upload everything downloaded to VirusTotal (a Google product)?

It doesn't, unless you opt into Safe Browsing's "Enhanced Protection" or enable "Help improve security on the web for everyone" under "Standard Protection". Both are off by default, IIRC. Without them, Chrome periodically downloads what amounts to a Bloom filter of "potentially unsafe" URLs/domains.
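To make the local-filter idea concrete, here's a minimal sketch of that kind of check. The mechanics are assumed for illustration (real Safe Browsing canonicalizes URLs, hashes many prefix/suffix expressions, and uses variable-length prefixes); the flagged URL and 4-byte prefix length are placeholders:

```python
import hashlib

def url_hash(url: str) -> bytes:
    """SHA-256 of the URL (the real protocol hashes canonicalized URL expressions)."""
    return hashlib.sha256(url.encode("utf-8")).digest()

# Hypothetical local prefix set shipped to the browser;
# "https://evil.example/malware.exe" stands in for a flagged URL.
FLAGGED_PREFIXES = {url_hash("https://evil.example/malware.exe")[:4]}

def locally_suspicious(url: str) -> bool:
    """True when the URL's short hash prefix is in the local flagged set.
    Only on such a hit would the browser contact the server with the full
    hash - so most downloads are never reported anywhere."""
    return url_hash(url)[:4] in FLAGGED_PREFIXES

print(locally_suspicious("https://evil.example/malware.exe"))  # True
print(locally_suspicious("https://example.com/benign.zip"))
```

The point of the prefix scheme is that the server never learns which exact URLs you visited unless a prefix collides with the local list.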

On the other hand, GMail and GDrive do run the checks via VirusTotal, as far as we know - which means the OP's case may have been caused by some of the recipients having their incoming mail automatically scanned. It's similar for the Microsoft version (FOPE users provide input for Defender SmartScreen), at least as of the last time I checked.

The hashes of everything that matches a "probably evil" Bloom filter, yes.

Hosting a virus on a domain and then downloading it a few times from different Chrome installations sounds like a good way to get the whole domain blacklisted...

  • That's why user uploads are worth some thought and consideration. File uploads normally get treated as a nuisance by developers, because they can become fiddly even when they work and you are still getting file-upload bugs from support.

    It normally isn't that much of a challenge to mitigate the issues, but other things get prioritized. Companies end up leaving open pivots to XSS attacks and similar bugs too.

    • Google has a great service for this called Checksum. You upload a file's checksum and it validates it against a database of all known bad checksums that might flag your website as unsafe. The pricing is pretty reasonable too, and you can proxy file uploads through their service directly.

      I'm actually not telling the truth but at what point did you realize that? And what would be the implications if Google actually did release a service like this? It feels a bit like racketeering.


  • I wonder if that could be triggered even when the certificate chain isn't validated... you could MITM yourself (for example, with Fiddler) and make Chrome think it's downloading files from the real origin. In that case, an attacker could do the same from multiple IPs and force Google to flag your whole domain.

  • Why isn't Dropbox blacklisted? Too big?

    • Dropbox actually provides a unique domain for each and every user - that's where the files you preview/download actually come from - and separates the UGC from the web front-end code and Dropbox's own assets that way. I have no doubt a fair number of those domains are blacklisted.


Sounds rather too resource-intensive? I've just tried with current Chrome on Windows and a 32MB zip on my personal domain; Wireshark says the file has not been sent anywhere.

  • Wouldn't it be more efficient to grab it in parallel to your download?

    • That would only shift the bandwidth cost between the original server and the user; Google's resources would be unaffected. And it's not what GP claimed.

      I just checked the nginx access logs - both the 32MB and the 8MB zip files have been accessed only once (both were created only for this experiment).
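A check like that is a one-liner against the access log. This is a sketch against a fabricated log with a placeholder filename, not the commenter's actual setup (the real check would point at something like /var/log/nginx/access.log):

```shell
# Build a tiny fake access log (placeholder entries) and count fetches of one file.
log=$(mktemp)
cat > "$log" <<'EOF'
1.2.3.4 - - [01/Jan/2020:00:00:00 +0000] "GET /test-32mb.zip HTTP/1.1" 200 33554432 "-" "Chrome"
5.6.7.8 - - [01/Jan/2020:00:00:01 +0000] "GET /index.html HTTP/1.1" 200 512 "-" "Chrome"
EOF
count=$(grep -c 'GET /test-32mb.zip' "$log")
echo "$count"  # a single hit means nobody but the tester fetched the file
rm -f "$log"
```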