Comment by judge2020

4 years ago

They don’t block CSAM because “it’s illegal” - in fact, they can’t be compelled to do it without the scan becoming a government search that implicates your 4th Amendment rights. Instead, all CSAM reporting and blocking is done voluntarily by these companies, and some barely participate (Apple[0]), so it’s a policy decision by these companies.

I imagine unblocking someone because a government entity exonerated them is legally risky - doing so might be taken as evidence that the entire CSAM scanning practice is a search/seizure conducted at the government’s request.

0: https://www.hackerfactor.com/blog/index.php?/archives/929-On... • “According to NCMEC, I submitted 608 reports to NCMEC in 2019, and 523 reports in 2020. In those same years, Apple submitted 205 and 265 reports (respectively). It isn't that Apple doesn't receive more picture than my service, or that they don't have more CP than I receive. Rather, it's that they don't seem to notice and therefore, don't report.”

With FOSTA/SESTA, Congress found what appears to be a highly effective 1st/4th Amendment bypass: companies do not receive Section 230 immunity for “sex trafficking” material. The government doesn’t say “thou shalt delete”; it just makes companies civilly liable for whatever happens on their platforms, which at their scale could be a death sentence. The result has been sweeping censorship of anything that even looks like sex work, no matter how consensual. If the intent was to keep protecting human trafficking cartels from modern competition while proving out a viable means of censorship, it has worked remarkably well.

EARN IT threatens to expand this to CSAM. Meanwhile, anti-CSAM legislation continues to develop in other jurisdictions, and as global entities big tech companies are exposed to that risk too. Hence Apple’s published plans to actively scan for CSAM on their devices (albeit locally, on-device, with creative use of cryptography to soften the blow) and to integrate with NCMEC. They’ve shelved this for now, but made it clear that they’re not done with the concept.
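For a sense of what the scanning practice discussed above looks like mechanically: providers match uploads against a list of known-bad image hashes supplied by NCMEC. This is a minimal, hypothetical sketch - real systems (Microsoft’s PhotoDNA, Apple’s NeuralHash) use *perceptual* hashes that survive resizing and re-encoding, and Apple’s design additionally hid the match result behind private set intersection; a plain cryptographic hash is used here only to keep the example self-contained.

```python
import hashlib

# Hypothetical hash list of known CSAM images (illustrative placeholder values;
# in practice this list is distributed by NCMEC, not computed by the provider).
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def matches_hash_list(image_bytes: bytes) -> bool:
    """Return True if the uploaded image matches the known-bad hash list."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES

# A provider would run this on each upload and, on a match, file a report
# with NCMEC - the "voluntary" step the parent comments are arguing about.
```

Note that an exact cryptographic hash only catches byte-identical copies; the perceptual hashes used in production trade that precision for robustness, which is also where the false-positive debates come from.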

Big tech dreads the loss of their liability shield, and these measures are an attempt to stay ahead of the policymakers. It’s not as free a choice as it may appear, and the federal government does not appear restrained by the constitution here.

Google isn’t the government; they are a third party you provide data to. The 4th Amendment doesn’t apply.

  • The argument was that Google cannot be legally compelled to do so (hence the potential violation); rather, they are doing it voluntarily while some others do not.

    • I’d hazard a guess that the nuance is how they process data.

      Google actively inspects and analyzes data across their cloud services, and Microsoft does as well. They therefore have a reasonable means of knowing that a user has problematic content. Services that merely sync data without processing it do not.

      In many US states, you are legally compelled to report child abuse and are subject to criminal sanction if you don’t. In New Jersey, failing to report child abuse is a misdemeanor for anyone. In New York, mandated reporters are compelled to report and face criminal and unlimited civil liability if they do not (and receive immunity from liability if they do).

      For a company like Google or Microsoft, the risk assessment is very complex, and the smart, and arguably morally right, move is to report.

How can data/files stored on somebody else's computer be subject to 4th amendment rights?