Comment by soraminazuki

4 years ago

> A Google spokeswoman said the company stands by its decisions, even though law enforcement cleared the two men.

Wow. Just wow. This is worse than Google's usual automated screw-ups. In this case, Google was notified of the issue by the NYT. Yet they actively chose to continue screwing over their victims just because they can.

> In a statement, Google said, “Child sexual abuse material is abhorrent and we’re committed to preventing the spread of it on our platforms.”

Just how tone deaf can Google be, continuing to treat these innocent folks as criminals in this passive-aggressive statement even after being proven wrong? Do these people have no empathy at all?

I suppose it’s defensive behavior. If they admit their mistake now then they could potentially be liable for the damages caused by their mistake years ago. Now any lawsuit would need to determine if there was an error and harm instead of just quantifying the harm.

I’d like to contribute to a crowdsource fund to prosecute cases like this.

When I was a kid the Comic Book Legal Defense Fund [0] was set up to pay for lawyers to defend comic book stores that were being targeted by over eager police departments and civil suits.

Maybe something like the Google Is an Asshole Legal Defense Fund could collect donations. The article mentions $7,000 as the cost to prosecute this person's case. Crowdsourcing can help with that.

[0] https://en.wikipedia.org/wiki/Comic_Book_Legal_Defense_Fund

  • I am not saying it is right, but to a large degree this is the cost that some of us 'pay' for millions having 'free' Gmail/GDrive etc. Fully automated processes close accounts, with no due process to get them reinstated in a timely way when the machine makes an error. You are correct: if they admit a mistake here, it will open the door to lots of claims. I sometimes think they could get a lot of people to pay for the service (with $, not just by having their digital lives harvested) if those people knew they would be treated better when something like this happens.

    The question everyone needs to ask themselves: if Google closed your account right now, for good, what would that do to your life...

  • $7,000 is a pittance. Maybe this case is simple, but many will not be. Say they raid a house and confiscate a hard drive. Encrypted or not, that is going to be a huge thing. Arguments will be made about whether anything incriminating was stored on that drive. Just google the cost of a forensic expert witness. Both sides will need one.

    Such costs are actually why so many police agencies are backing off of CP investigations. They still prosecute where evidence is clear, such as when someone emails such material openly, but they aren't willing to invest the tens or hundreds of thousands of dollars necessary to handle the complex cases involving encrypted communication/storage. $7,000 would be a bare minimum for only the simplest of legal defenses in the simplest of cases.

The article mentions two independent instances of this process within Google, where appeal is not possible even with a police report that completely exonerates the suspect.

It sounds to me as if a class action lawsuit is the most appropriate remedy for the unfortunates who are caught in this predicament. Their only problem is finding each other.

For the rest of us, it is unwise to use cloud storage for photos, for several reasons.

  • They don’t block CSAM because “it’s illegal” - in fact, they can’t be forced to do it without violating your 4th Amendment rights. Instead, all CSAM reporting and blocking is done at-will by these companies, and some don’t participate (Apple[0]), so it’s a policy decision by these companies.

    I imagine unblocking someone due to them being exonerated by a government entity is legally risky - perhaps doing so would be considered enough proof/evidence to deem the entire CSAM scanning practice as a search/seizure at the request of the government.

    0: https://www.hackerfactor.com/blog/index.php?/archives/929-On... • “ According to NCMEC, I submitted 608 reports to NCMEC in 2019, and 523 reports in 2020. In those same years, Apple submitted 205 and 265 reports (respectively). It isn't that Apple doesn't receive more picture than my service, or that they don't have more CP than I receive. Rather, it's that they don't seem to notice and therefore, don't report.”

    • With FOSTA/SESTA, Congress found what appears to be a highly effective 1st/4th amendment bypass: companies do not receive Section 230 immunity for “sex trafficking” material. The government doesn’t say “thou shalt delete”; it just makes companies civilly liable for whatever happens on their platforms, which could be a death sentence at their scale. This has been overwhelmingly effective at censoring anything that even looks like sex work, no matter how consensual. If the intent was to continue protecting human trafficking cartels from modern competition while providing a viable means of censorship, it has succeeded.

      EARN IT threatens to expand this to CSAM. Meanwhile, anti-CSAM legislation continues to develop in other jurisdictions, and, being global entities, big tech companies are exposed to that risk. Hence Apple’s published, explicit plans to actively scan for CSAM on their devices (albeit locally on-device and with creative use of cryptography to soften the blow) and integrate with NCMEC. They’ve shelved this for now, but made it clear that they’re not done with the concept.

      Big tech dreads the loss of their liability shield, and these measures are an attempt to stay ahead of the policymakers. It’s not as free a choice as it may appear, and the federal government does not appear restrained by the constitution here.

  • To be fair, users of many internet services exposed themselves. The warnings about that were loud and clear. I don't know if using their free service counts as a formal business relationship, that might make recourse more difficult. I think the service provider has the right to close any and all relationships unilaterally.

  • Cloud storage for anything valuable.

    Note that it was the telehealth provider who provided the images to Google in the first place, not the SF techie dad.

What's the incentive for Google to EVER give the accounts back? If they wrongly deactivate an account (like here), you get a bad article, EFF and friends ruminate about your behavior, and the world mostly moves on.

If Google wrongly gives an account back, you get a different article: "Google helped child pornographer even after discovering CP in their account". Now that gets attention. That's a scandal that leads to political action, criminal charges, etc.

To be clear, I'm not advocating for how Google behaves. They're a lot more like a utility and probably should be treated like one (alongside the protections and requirements that come with being a utility; you don't hear "electric company stopped serving house because a man suspected of CP lives there").

For the responses saying "Well the police cleared them", again I don't disagree. But if you're an executive making this decision you're thinking:

1. We never give back an account in this case and avoid the massive downside risk

2. We go through a lot of work to design a process that will affect a marginal portion of customers, really really hope nobody manages to social-engineer themselves past it, and pray that no enterprising news outlet/politician tries to make the "Google helped CP person recover their CP" story - they already have a target on their back.

  • In what world would Google receive criticism for giving back accounts to people who have been proven innocent?

    Google's surveillance system and automated ban hammers are already bad enough. But the actions they took following the ban in this case are egregious and 100% indefensible. At the very least, Google could've reinstated their victims' accounts and issued a full, sincere apology upon being contacted by the NYT. If Google has any care for their users, they'd do that for everyone they wrongly reported whose names were cleared. Instead, Google doubled down, continued to treat their victims as criminals in their statement, and even leaked details about intimate photos in a blatant attempt to discredit the users they wronged.

    No parent should ever have to go through what Google put these parents through while trying to get care for their child. Most of all, they should never have to risk losing custody of their child because their child got sick. They should never lose access to their whole digital identity because they didn't know any better than to rely on Google. Yet this is what Google did to these parents, full stop.

    • >> If Google has any care for their users, they'd do that for everyone they wrongly reported whose names were cleared. Instead, Google doubled down, continued to treat their victims as criminals in their statement, and even leaked details about intimate photos in a blatant attempt to discredit the users they wronged.

      Google's users are advertisers. Ordinary people are data sources and ad viewers. Android phones and Chrome are for collecting data about people and showing them ads.

      Google doesn't care about losing two data sources and ad viewers.

    • I edited my comment on this point and largely I agree, but as far as anyone in Google's position is concerned:

      1. This is a rare edge case with asymmetric risk and an easy way to avoid said risk

      2. A single failure could be catastrophic

      3. No process is perfect especially in the face of someone actively trying to thwart it

      You could try really hard to make sure it's not attackable and pray, for little benefit to Google. Or just tell people "too bad, you're on your own".

      Even if this process were assuredly bulletproof,

      > In what world would Google receive criticism for giving back accounts to people who have been proven innocent?

      The sort of drivel/clickbait that gets published to 'make a good story' is astounding. Or politicians, not known for honesty, could totally play this up if their base has an axe to grind against Google.

      No giant corporation (especially publicly traded) is going to take such risks to revive a few wrongfully terminated email/photo accounts unless they have an obligation to (i.e. should Google be a utility). Their responsibilities aren't to be nice, or act in a good way, but to protect their profits. To the extent that I've seen corporations 'act nice' it's largely been to benefit their own employees or pursue some pet cause of an executive.

    • Is "proven innocent" a thing in the US?

      (Your use of "full stop" suggests you're not American either - so I'll rephrase, "is proven innocent a thing in any English speaking country"? I actually don't know...)

    • Devil's advocate:

      Proven innocent is not a thing, there is only "declined to prosecute [because there was no crime]."

  • If due process was followed, and the police / state exonerated the parents, I don’t think anyone would blame Google for reactivating the accounts. They’d blame the police or whatever flawed exoneration process was used.

    At least, that’s what I’d hope.

    Google here looks even worse than I thought possible, and I’m a de-Googled, anti-fan, so I already had a very dim view of them.

    • What if the police don't feel it's interesting enough yet (or don't have funds) to investigate and put things on the backburner for lack of resources? Similar to the mountains of unprocessed rape kits.

      These are interesting cases, but they seem to be the ones where the police bothered to do anything.

    • What's this exonerated?

      They didn't do anything. Investigation/state scrutiny shouldn't be treated as a mark of guilt that requires some formal vindication.

  • > What's the incentive for Google to EVER give the accounts back?

    Ethical, moral responsibility? Being a nice entity. And it takes zero effort to do so, especially once he is cleared by the cops?

    You know, stuff like Don't be evil? Oh wait...

  • Employees of this office are very small and delicate, deserve protection from local pervs. Better a thousand innocent men are locked up than one guilty man roam free.

    — Dwight K. Schrute

  • Google is not anything like a “utility”. A utility has a natural monopoly because of the effort, expense, and disruption required to lay down the infrastructure, and the need for scale. There is no Google product for which you can't, and shouldn't, pay for a competitor's product instead.

Welcome to fascism.

A private company is the de-facto judge, jury and executioner because it owns half the infrastructure you live your life on.

  • With the flick of a switch, they can deploy this technology to a global scale and way beyond protecting children. Imagine the feature propagating to their Chrome browser or their smart speakers "listening in" on what's happening in your home to "prevent" crimes by sending the cops whenever you raise your voice or say "the wrong things." This kind of power should not be in the hands of a single company.

    Edit: I recently read this book called "The Every" which explores a similar scenario https://www.goodreads.com/en/book/show/57792078-the-every

    • Or any company, or groups of companies. The enterprise is a necessary evil; it should be treated as such and not be trusted with anything.

    • >I recently read this book called "The Every"

      Reminds me of another "Social Media Corp Gone Bizarrely Crazy" plot that I read recently, The This, by Adam Roberts. You won't anticipate how Crazy the social corp in question goes till you're halfway.


  • And you can’t even escape these companies by moving to another country. They have their tendrils everywhere, except in places that consciously prevent them - which is usually done by even worse tyrannies like China/Russia.

    Always makes me laugh when I see employees of Google, Amazon etc. claiming to be “anti-fascist” and “standing in solidarity with the common man” etc etc…

    • >Always makes me laugh when I see employees of Google, Amazon etc. claiming to be “anti-fascist” and “standing in solidarity with the common man” etc etc…

      Having had the pleasure of sharing the elevator with said individuals in shared office buildings the "common man" part almost made me lose my coffee.


    • >claiming to be “anti-fascist”

      I can't fault them too badly for that. I consider myself personally against the horrors that go into gathering the constituent materials for, and the assembly of, computers. This position co-exists with the fact that I, like most others here, derive the majority of my livelihood from Doing Things With Computers, which wouldn't be possible without the computers themselves, which wouldn't be possible without the horrors.

      Worse still, I live a provably better life than those of my peers whose youth did not, for whatever reason, revolve around Doing Things With Computers, let alone the far-flung people whose labour provides me with the computers on which I do those profitable things.

      It's difficult to morally reason about.

    • > And you can’t even escape these companies by moving to another country.

      In my personal view, the internet became one big country, with overlapping laws and government policies and collusion by corporations. Every so often I use Google to see what search results they can muster, but otherwise I don't use cloud storage and do not find myself depending on Google. For email I try to teach my friends how to use Thunderbird so they can click a button and their email is mostly GPG encrypted.

      If I wanted to share files with someone that will not fit in that GPG encrypted email then I plonk them down on a mini-PC or a VM and share them over HTTPS with cache-control headers to reduce risk of file caches or SFTP with authentication. This is just my own preference and I am a stodgy cranky old bastard but if someone decides that basic auth is too much friction then it was not important for them to receive the files. I implement my own data retention and destruction policies. How my doctor or lawyer decides to store the files is up to them. I can only hope they are wise enough to not store things on their fondle-slab. An intelligent doctor should be able to handle basic authentication and/or be able to follow simple instructions for creating and sharing a GPG public key with me.

      Or I could just snail-mail them an encrypted USB drive; however, a GPG encrypted email should suffice for sending a few little images to a doctor. Some will bring up rubber hoses and wrenches, but there are mitigations for such things. Some might even want legislation to bandage these dark patterns, but experience has taught me not to trust that corporations would be held to account at the same level and standards as citizens.
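
      As a rough illustration of the self-hosted approach described above, here is a minimal Python stdlib sketch that serves a directory with HTTP Basic auth and a no-store cache header. The credentials and handler name are hypothetical, and in practice you would put TLS in front of it:

```python
# Hypothetical sketch: serve files over HTTP with Basic auth and
# "Cache-Control: no-store", in the spirit of the comment above.
# A real deployment would sit behind TLS (i.e. actual HTTPS).
import base64
from http.server import SimpleHTTPRequestHandler

USER, PASSWORD = "friend", "s3cret"  # placeholder credentials

def auth_ok(header):
    """Return True if an Authorization header carries the expected Basic credentials."""
    expected = base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
    return header == f"Basic {expected}"

class NoCacheAuthHandler(SimpleHTTPRequestHandler):
    def do_GET(self):
        # Reject requests without the right credentials.
        if not auth_ok(self.headers.get("Authorization")):
            self.send_response(401)
            self.send_header("WWW-Authenticate", 'Basic realm="files"')
            self.end_headers()
            return
        super().do_GET()

    def end_headers(self):
        # Ask browsers and intermediaries not to cache the shared files.
        self.send_header("Cache-Control", "no-store")
        super().end_headers()

# To serve: http.server.HTTPServer(("", 8080), NoCacheAuthHandler).serve_forever()
```

      The point is that basic auth plus no-store headers is a few dozen lines of stdlib code, not an exotic setup.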

    • > And you can’t even escape these companies

      So you are forced to use an iPhone or Pixel phone, forced to search with Google, forced to use AWS.

      Even though open alternatives exist for all these who would love your support.

    • What if I use a Huawei phone, which is banned from using Google services? Obviously, China will have all my info, but they can't (yet) ban me from boarding US trains for a low social rating, so what do I care?


  • "The word fascism has now no meaning except in so far as it signifies 'something not desirable.'" --George Orwell

    • Yeah, I also think the word/concept fascism doesn’t fit here. While it is obviously oppressive or unjust, it is a company, so what would be a better fit?


    • This quote is taken out of context FYI. Here is the full version:

      "The word Fascism has now no meaning except in so far as it signifies ‘something not desirable’. The words democracy, socialism, freedom, patriotic, realistic, justice, have each of them several different meanings which cannot be reconciled with one another. In the case of a word like democracy, not only is there no agreed definition, but the attempt to make one is resisted from all sides. It is almost universally felt that when we call a country democratic we are praising it: consequently the defenders of every kind of régime claim that it is a democracy, and fear that they might have to stop using the word if it were tied down to any one meaning. Words of this kind are often used in a consciously dishonest way. That is, the person who uses them has his own private definition, but allows his hearer to think he means something quite different"

  • This is at best authoritarianism or corporatism, not fascism. There is no ultranationalism, there is no "othering", there is no enforced hierarchy of individuals.

  • That's not what fascism is. Fascism is hyper nationalism where everything is done in service of the nation state. Private entities only exist insofar as they are extensions of the state.

    What you're describing here is more like a cyberpunk corporatocracy, where corporations hold so much power independent of the state, that they are able to exercise their own arbitrary decisions extrajudicially, while still maintaining so much power over people so as to completely control their lives.

    In fact, here you can see that the person was exonerated by the state but punished by the corporation. In fascism, nothing supersedes the state.

  • Unfortunately there are some issues that people get so angry over that they'll openly support fascism if it means acting against them

> Just how tone deaf can Google be, continuing to treat these innocent folks as criminals in this passive-aggressive statement even after being proven wrong? Do these people have no empathy at all?

They don't care a single bit about the effect their actions have on others. They only care about not having to build a system that can distinguish such cases from actually criminal ones, because that wouldn't scale and would be bad for business $$$. So they try to turn and twist their image in the eyes of the public, to make what they did look "right," so that the public does not cry out and demand changes to their systems. Empathy doesn't even enter the equation for Google.

    > They only care about not having to build a system that can distinguish such cases from actually criminal ones

    There are only two ways to actually do that:

    1) Make Google's policies 100% subservient to the United States legal system, which would look a lot like the "corporate / national lock-step unity" one sees in actual fascism

    2) Google builds its own court system, independent from the United States court system but with equivalent power

    Are either of those scenarios desirable?

    • How about just saying in hindsight: “Hey, we were wrong! [Yes, it’s possible! The almighty Google can be wrong! Newsflash!] We are sorry; these things happen with our automated systems. Person XYZ is not guilty of anything we accused them of, and we will revert all actions taken on our side. It is in the nature of things that these cases are difficult to distinguish without information about the child as well as the parent, and we need to stay vigilant about pictures of children being uploaded to our services. Please accept our apologies.”

      Instead of going: "Nooo! We were still right! We don't care what others say or what facts were found out!"

      So I think you are painting the scenario a bit wrong by saying there are only two options. An honest apology, and actions to make up for one's mistakes, can go a long way. Of course I would not expect Google to act that way.

I’m amused because it’s a microcosm of the actual problem: it’s probably the default response to anything that has to do with child sexual abuse material, given without thought to context or circumstances, with too little review. But hey, I guess it’s Google’s official position that this dad is a child pornographer ¯\_(ツ)_/¯

> In a statement, Google said, “Child sexual abuse material is abhorrent

I love the use of the qualifier "sexual" here, to make it clear they don't care about other types of child abuse (like interfering with access to health care, which Google is clearly guilty of in this case...)

Well crafted weasel words, PR folks!

I wonder why the person decided $7k was not worth the lawsuit. Did their legal counsel tell them it was hopeless to win anything?

I’m more and more convinced that they just never bothered to write the ban-hammer feature in a way that is actually reversible.

Probably the button simply doesn’t exist to undo the termination 30/60/90 days later. Good luck getting them to admit it.

> even though law enforcement cleared the two men

I've said it before and I'll say it again: Google now has enough power that it has effectively turned into a globalist government, a government you did not vote for.

What do you think is the ratio of innocent to nefarious pictures of naked children Google encounters in aggregate?

This is relevant to how outraged one should be by this story. I think it is probably > 1:100000. As such, probably not much outrage is warranted, although it’s obviously not great for this one guy.

  • Wow, I'd guess the opposite when we consider the base rate. Seems like a classic Bayesian problem.

    It's fairly normal for parents to take pictures of their children naked in the bathtub/at the beach/camping/etc.

    Conversely, I'd expect actual pedophiles and CSAM producers to be really quite rare.

    So even a relatively low base-rate of normal parents with normal nude photos would likely dwarf CSAM upon detection.

    So, if we say 1/100 are pedophiles, and 30/100 are parents, and of the parents 10% have such photos, the ratio I'd expect without getting into detection rates is like 3:1 in favor of normal parents.
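
    The back-of-envelope arithmetic above can be written out directly; every number here is the commenter's stated assumption, not a measured statistic, and detection rates are ignored:

```python
# Sketch of the ratio argument above, using only the commenter's
# assumed numbers (not real statistics).
p_pedophile = 1 / 100    # assumed share of users who are pedophiles
p_parent = 30 / 100      # assumed share of users who are parents
p_has_photos = 10 / 100  # assumed share of parents with such photos

innocent = p_parent * p_has_photos   # 0.03 of users have innocent photos
ratio = innocent / p_pedophile       # 3.0, i.e. 3:1 innocent to not
print(f"{ratio:.0f}:1 in favor of normal parents")
```

    The conclusion is sensitive to the assumed base rates, which is exactly what the replies below argue about.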

    • > It's fairly normal for parents to take pictures of their children naked in the bathtub/at the beach/ camping/etc.

      And what I understand is that each of these private photos of their children that parents take on an Android phone with Google cloud enabled gets specifically flagged to be shown to a stranger working for Google.

      That sounds pretty insane to me.


    • If the types of photos you mention were getting flagged, as in the case in the article, we would be hearing about a lot more cases. The case itself involved a medical photo, not smiling children at the beach. As such, the denominator you use isn’t relevant here.

      There is a lot of child pornography on image-sharing platforms. Facebook, which reports most reliably, had 20 million CSAM reports in 2021, and that’s photos posted to a social network or sent via Messenger, not even in private albums. As I understand it, CSAM scanning is more focused on reporting re-uploads or transmission of known abuse material rather than new material. So I still maintain that if we take the type of image referenced in the article as the denominator, vastly more of such photos would be child abuse material. Granted, 1:100000 is too high; I would revise that down to 1:1000.


  • If it wasn't clear, people aren't objecting to the use of automated tools to prevent crime. It's that there is absolutely no avenue of appeal or review, even if law enforcement exonerates them and big news media report it.

    Google has inserted itself into almost all spheres of digital life by hook or by crook. It's practically difficult to avoid them in many services, especially email. And now they play judge, jury, and executioner. I don't understand how any of this is acceptable, much less justifiable. The old 'think of the kids' argument used to justify digital authoritarianism is such a cliché by now.

    • > big news media reports it

      The guy in the article works in the tech industry and lives in a city where a lot of journalists live, so he could get his story into the media.

      What about others?

  • I’d put pretty good money on almost every family having pictures of their naked kids doing some shenanigans. I know I have those pictures. My parents have those pictures of me and my sister. Quite a few, as I seemed to enjoy trying to run about naked…

    Bill Watterson somehow managed to sneak watercolor paintings of a naked little boy into every major newspaper under the guise of being a “comic strip” - the perv.

    • A lot of people don't, specifically because they are aware of the tremendous risk of an insane government or corporation ruining their life over innocent behavior.

      Dare I say that it is a peek into how Black people feel when they go out in public to do...anything.

  • Somehow I think there are more parents who sometimes need to take a photo of their naked children than there are paedophiles.

    Or at least the ratio is clearly not 1:100000; it's maybe closer to 1:10.

    You would need statistics on how many times Google has reported to the police and how many times it has turned out to be a false alarm. Does Google even keep records of false alarms? Most likely they don't, to avoid responsibility.

    • I'd expect it to be a lot lower than 1:10. All parents have nude children at some point, they're sort of made that way. The number of pedophiles is higher than anyone wants it to be but it's not 1:10 the number of parents or they'd be a voting bloc. Probably 1:10000

  • Well as a data point I have pictures of my children naked. As another data point my parents have photos of me as a child naked, and as a third data point my grandparents have photos of themselves as children naked. Whereas I don't knowingly know any paedophiles.