From my travels, my impression has been that America in particular treats child nudity as completely, unexceptionally obscene, beyond even adult nudity.
Compared to a beach in Europe, where nearly half of children under 2 run naked, there seems to be no grey area or minimum acceptability in the US.
It makes me wonder if our hypersexualized treatment of child nudity /actively contributes/ to the sexualization of children in our culture.
Haha yeah, in Europe there are often naked children running around at the beach. In Germany, there are places (especially lakes) where everyone is walking around, swimming, and sunbathing naked, sometimes the whole family. For me it was quite a shock initially, but now I think it's fine.
I find that a bit weird tbh. I mean, if we look at human tribes throughout the world and throughout history, and even at what is most common nowadays among African tribes that maintain their traditional dress, they usually cover their genitals. That indicates to me that it is human nature to do so. It seems to me that some countries, for various reasons, are going against the grain, purely out of culture.
But there's something weird about it. It's like when you look at previous periods in time and they now seem strange, often just reactions to earlier epochs or other cultures (the way neoclassicism arose in opposition to Rococo and Baroque).
This feels like that. It doesn't feel natural. It feels like it comes purely from a contrarian way of thinking, about how forward-thinking they are: "look how superior we are, that we are completely devoid of any instinct to cover ourselves". And perhaps it's somewhat of a sexual counterbalance to the strictness of the rest of their culture. I'm not sure I'm explaining this in the best way. I think there is a natural middle ground, the one we have seen throughout hunter-gatherer human history, and the Nordic countries are just being culturally weird by going to one extreme.
I don't think it's quite as strict as you're making it out to be.
I was in a park in Manhattan last week, which had a bunch of big sprinklers for kids to run through. No one was naked in the sprinkler, but some parents helped their children change in/out of their bathing suits out in the open.
(Then again, I only remember this because I was a tad surprised by it.)
In Japan, children play a game called kancho, where they put their index fingers together and try to insert them into the anus of an unsuspecting friend.
In America, you could destroy someone's life over that (possibly even multiple lives).
A different article recently prompted me to wonder (I'm American): We know exactly at what age someone becomes 'legal' wrt images - do we know on which day of their life a person first becomes 'illegal'?
In the US, there is an unspoken belief that preventing criminals from "winning" is more important than protecting citizens.
This is why police fired into a stolen minivan that had an unrelated child and a shoplifting/carjacking suspect in it, killing both. Protecting the child took a back seat to punishing the perp.
Therefore CP laws very quickly stopped being about protecting the children and became more about punishing the pedophiles.
Vindictiveness and spite are the unspoken role models of the US justice system.
Nudity in kids is seen as obscene purely because a pedophile could find it arousing. We are preemptively sexualizing children in order to prevent their sexualization.
A child was once forced to masturbate in front of a camera by COURT ORDER. Police held them down. All because the child was a victim of child pornography and the court wanted a "comparison image" in a similar arousal state to prove it was an image of that child.
Once again, a child was victimized by the justice system, in order to punish a perp.
Never make the mistake of thinking anybody in the justice system gives a rat's ass about protecting anyone.
> A child was once forced to masturbate in front of a camera by COURT ORDER. Police held them down. All because the child was a victim of child pornography and the court wanted a "comparison image" in a similar arousal state to prove it was a image of that child.
That summary is misleading and, I feel, wrong. It's not accurate to state that he was "a victim of child pornography"; the complaint in this case was against him, brought by the parent of another minor he was "sexting" with.
That doesn't excuse the police behavior here, but you're attempting to paint a picture where the victims in these cases are outright ignored in a misguided search for justice. You're twisting this case[1] to fit your narrative, I think.
Further, he sued the government and won. The courts made it perfectly clear: the lower courts and police were absolutely in violation of this teen's rights when they granted and executed the search warrant. So egregiously that "qualified immunity" doesn't even apply to the officer's estate.
I wasn't going to address this, but..
> Therefore CP laws very quickly stopped being about protecting the children and more about punishing the pedophiles.
Pedophiles create a market through demand. Often it is also pedophiles that are on the supply side of this market, but not always. Merely participating in the demand side implicates you in these crimes against children as you are suborning their abuse. One could say that our system merely recognizes this fact.
> Not only did he lose emails, contact information for friends and former colleagues, and documentation of his son’s first years of life, his Google Fi account shut down, meaning he had to get a new phone number with another carrier. Without access to his old phone number and email address, he couldn’t get the security codes he needed to sign in to other internet accounts, locking him out of much of his digital life.
> “The more eggs you have in one basket, the more likely the basket is to break,” he said.
I only have a Google account to have access to the Play Store. Everything else (mail, calendar, photos, and storage) has been moved off Google.
Same here. Those shitty providers are too high a risk.
Many years ago, Microsoft banned my Outlook account for no reason. I tried contacting support to resolve the issue: nothing. Only after writing a letter did I receive an answer, from Microsoft Germany:
they had banned my account for allegedly distributing porn through OneDrive (which I was not doing). Their investigation showed that Microsoft could not find any wrongdoing. The account was gone nevertheless.
This is why I say:
- skip Gmail/Outlook and all this crap; use something proper like mailbox.org with your own domain
- don't trust the cloud. Back up once a month, use two clouds at the same time, and keep cold storage with your backups
- the biggest risk is your Apple iCloud account. Don't be dependent on them: don't use their credit card
- don't use Google/MS 2FA apps
- have multiple accounts in parallel: one for Google Drive, one for search, one for mail (if you need it), one for YouTube, etc.
Isn't the ability to migrate phone numbers a legal requirement for carriers?
Edit: yeah it is [1]
> Phone companies are required by law to port your number out when you start service with a new carrier. According to the FCC, a company can't refuse to port your number even if you have an outstanding balance or unpaid termination fees.
When porting to a new carrier, the new carrier asks for your account information and PIN for the old carrier. The new carrier provides that to the old carrier to prove that the port request came from you.
It looks like with Google you are supposed to use the Google Fi app or website to obtain a "port out number" and PIN [1] to give the new carrier when they ask for the account number and PIN.
If your accounts have been banned, I wonder if you can still get that information.
This is why I’m something of a two factor authentication Luddite. I (think) I understand the benefits - but the fact that every factor is an additional single point of failure freaks me out.
TOTP keys, backed up physically (as in, written down and stored somewhere safe), are probably a good way to balance the failure mode with the security.
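For what it's worth, the reason a paper backup works is that TOTP is tiny: the base32 secret is the whole credential, and any RFC 6238 implementation can regenerate your codes from it. A minimal sketch in Python (standard library only; the secret in the test is the RFC's published test value, not a real one):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """Derive an RFC 6238 TOTP code from a base32 secret
    (the string you'd write down on paper as a backup)."""
    # Decode the base32 secret, tolerating missing '=' padding.
    pad = "=" * (-len(secret_b32) % 8)
    key = base64.b32decode(secret_b32.upper() + pad)
    # Counter = number of 30-second steps since the Unix epoch.
    t = time.time() if for_time is None else for_time
    counter = struct.pack(">Q", int(t) // step)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): take 4 bytes at an offset given by
    # the low nibble of the last byte, mask the sign bit, keep `digits`.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

So losing a phone costs you nothing as long as the written-down secret survives; the trade-off is that anyone who finds the paper holds your second factor, which is why it belongs somewhere genuinely safe.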
> The more eggs you have in one basket, the more likely the basket is to break
This. Use Yahoo for your email, Verizon for your phone number, Dropbox for your backups, Facebook for your socials… It doesn’t matter which you choose, but fragment your digital life. When one part goes bad, it won’t infect the others.
Honestly, I wish self-hosting were easy enough for ordinary people to do. Then there would be no reason to recommend that people fragment their digital life among dozens of ad-funded SaaS platforms. There is no reason why it has to be so hard, but not enough effort has gone into developing such systems, for obvious reasons. The Sandstorm project, for example, pioneered a model where users could deploy web apps the way they install mobile apps on their phone. Sadly, it didn't achieve the level of popularity it needed.
I'm still totally not off Google yet but this is the direction I'm going in. Fastmail for email (still need to do the POP/IMAP setup for local caching using Thunderbird), local ISP for phone number, Syncthing for syncing across PCs/Android phones (except iOS which lives in its own world) and Facebook has been deleted long ago. Just takes a while to get off Google because of years of sign-ups but I already got the financial and government services off it.
When I started using Google, their motto was still "Don't be evil." My entire business and personal life runs on it, so I have no way to switch: 300 employees' email, all our documents and videos, and tons of custom scripts would have to change over.
The biggest problem with Play store is that many apps are geographically restricted. That includes banking and public transport apps. Some apps are restricted without even the knowledge of the developer. Most of those apps have no good reason at all to be restricted like that. Such apps often cannot be accessed through Aurora. Sadly, there are no laws or provisions to challenge such unethical tactics. Using Google's services is like living with a manipulative and abusive partner.
Nowadays you can buy a NAS, plug it into your router, click the "install Nextcloud" option, and you have it running; then you add the app on your phone. It's nowhere near as difficult to get this level of capability as it used to be, now that Docker is integrated in a user-friendly way into a lot of NAS software.
The biggest failure here is Google's continued refusal to segment their services and this has been a problem since at least the Google+ days, which is more than a decade ago at this point.
During Google+, Google controversially instituted a "real name" policy. It was driven entirely by Vic Gundotra, who would get up at company meetings when asked about it (back when he still answered questions, because believe me, that ended) and say, "we don't want people named 'dog fart'".
Legitimate concerns that people might have for their safety were completely brushed aside.
Anyway, the enforcement of this was of course automatic and resulted in, I'm sure, many false positives. But what happened when your account was flagged? You lost access to everything.
This too was criticized at the time and people asked "well if it's a policy violation for Google+, why do people lose their Gmail?". These questions too were completely brushed off.
At this time I decided I simply couldn't and wouldn't use any other Google service, because my Gmail is too important to risk on an automatic ban from a false positive on a completely unrelated product.
And here we are, a decade later, with the exact same nonsense happening.
Now of course the CSAM ban itself is ridiculous. Refusing to reverse it is ridiculous. All of that is true. But don't tie your email or your phone to a company's other services when they do blanket bans like this.
What I don't understand is why the stupid policies established by a couple of troubled individuals years ago, for failed projects, are still in use today...
Give us the +word back... don't "improve" the stupid quotes.
This story should serve as a warning, and as motivation to move away from iCloud, Google Cloud, and the like for video and photo storage.
The easiest thing is to just purchase a drive or NAS and store your media on it. If you're a bit tech-savvy, you can run your own Nextcloud. I run Nextcloud on Hetzner, a 1TB storage box plus a web server, for a total of $7 a month. Or you can get Hetzner to run Nextcloud for you for about $4.70 a month for 1TB of storage, but then you don't have full control over it. There are also quite nice Nextcloud apps for Android and iOS that you can configure to sync your photos and videos into your cloud.
Not OP, but I also use Hetzner and their storage boxes. Servers I use will connect using samba. For temporary access from other devices I usually use FTPS, but they also offer a bunch of other protocols.
Can NextCloud do OCR on the pictures, identify people and allow search by objects in the photo? Because those are the killer features for me that make it super useful to use Google photos.
I am a software professional and run my own email and it's a massive pain in the ass. I started also using a google account just to use for all the stupid sign ups the internet makes you do, and use my real email for actual correspondence, banking, and a few other things only.
I worked in a 1 Hour Photo when I was in college. The standing rule we had was that photos of naked adults were not printed (though the customer was given their negatives). In the case of naked children, if it was the typical "kid taking their first bath" it was fine, but if there was any doubt, we had a manager review it. I think we had to call the cops a couple of times (more to pass the buck than to make an official call like that ourselves), but there is/was a policy for things like that.
One example would be protecting employees from someone just bringing in photos of themselves with the sole intent of exposing themselves to the employee, as they have a captive audience.
I keep telling anyone who will listen that they need to move every aspect of their online identity away from the big tech giants, and make sure that each type of service (email, webpage, memory storage, document storage, backups, contacts, etc.) is compartmentalized in such a way that if one gets removed, it won't affect the others in any way.
Imagine having all your memories in the form of images and videos taken away because of sloppy review work at a tech company.
I want to, really I do, but it feels like trading one devil for another.
Email is by far the biggest albatross around my neck. Migrating email requires me to set up two different failure points: a purchased domain and a Fastmail-like service. Now that is two different places where I could forget to pay a bill, be socially engineered into giving away my account, etc. To say nothing of the long tail of acquaintances who only know me by my current address.
Yet still, this existential terror exists that I will Do Something Wrong (no you will never know what it was), and lose everything.
E-mail should be the first thing you migrate, precisely because it is the key to every other service you rely on. If you're randomly banned from your E-mail service, you're pretty much screwed. E-mail is too important to let someone else host. Move it as close to your own control as your technical ability allows. Do it today, in fact, do it right now.
Autopay is simple to set up with any hosting provider. If you're in a financial place where the $5/mo and $10/yr payments might bounce... yeah, probably don't pay for email.
As far as the social engineering goes, personally I gladly take that risk in order to bring my email firmly into my control. If I'm going to lose access to my email, I want it to be because I did something stupid, not because an algorithm flagged my account and Google has no humans I can appeal to.
The long tail of acquaintances isn't bad either: just forward emails from Gmail to your new address. If you lose access to the Gmail account at some point, you'd have lost everything anyway, so this arrangement would be strictly better.
Do you use the same email or 2FA for all of those? If so, you still have a potential single point of failure if that provider decides to ban your account.
That sounds sorta expensive, but I get it. I'm at 3TB of photos and videos, and it's becoming unmanageable... Is anyone aware of a self-hosted, multi-redundancy photo storage setup that doesn't involve manually copying large amounts of files? Or other non-time-consuming setups that work with iPhones (for me and my SO)?
This event happens so rarely that it would be better advice to tell people to wear a bulletproof vest and helmet when going outside. That is more likely to benefit them than self-hosting media.
Also: I wear a helmet or a belt every single time I drive even for 1km. Should I not? I never needed them. 0% use rate so far for me, might as well not have done it.
The issue is that when it happens, you're done.
Considering that you probably don't need to keep all your eggs in one basket, why not at least try?
On this note, I wish providers were forced to provide a third-party backup mechanism for important data so that if I were to lose:
- the email: it could forward my email to at least receive them
- my photos: a copy could be kept on multiple hostings
Anecdote: It's happened twice for me. Once was for an app I published they decided was against ToS, two years after I published it. And the other was an accidental account deletion when upgrading Google Apps (luckily this one was mostly recoverable... if you know the right people at Google).
HN should add a [Corp Support] tab. If Google & Co. won't create their own support system, let's do it here, and add a dark-pattern leaderboard for not replying/acting.
I’ve heard great things about Google Domains but this kind of story is exactly why I probably won’t be using that service. It’s just too risky if you lose everything at once.
Honestly: don't upload unencrypted content to anyone, for exactly this reason.
I have cloud backups of family photos, but they're all through restic or rclone with the crypt filter applied. Privacy is about the right to put yourself in context.
Sorry, 99.99999% of the general population doesn't know what restic or rclone is. In fact, I wouldn't be surprised if 90% of software engineers have never heard of them. These things aren't really known outside circles like Hacker News.
Yeah, given the way Google likely ties your accounts together, a wrong decision on any Google account, even one that isn't the account holding your domains, could end up with all your domains stolen by Google.
What I'd like to know is whether they actually deactivate multiple linked accounts when any one of them gets flagged. I have three accounts: one for personal things, one more professional, and a third with my current country as its location, for getting local apps in the Play Store.
Google knows I am the same person, even though they are different accounts. So are the other two accounts safe when one of the three gets flagged?
As a basic consumer, storing somewhat unimportant things on Google is potentially risky for that content, but you aren't paying them anything, so it's probably worth the trade-off. For a business, or for something you pay for, Google's support is atrocious, and it's not worth the hassle given all the horrendous failure modes it can put you in. One thing Google consistently teaches people is not to pay them directly, because they don't know how to treat their customers.
> Mark spoke with a lawyer about suing Google and how much it might cost.
> “I decided it was probably not worth $7,000,” he said.
I believe it is one of the roots of the problem. How is it even possible that getting justice in court in such a trivial case costs about three months of median income?
Well, it's really not straightforward to know what the outcome of such a lawsuit would be. Google's ToS is one thing, but the main point is that a random user uploading family photos is not a protected group, so Google refusing service is very much within their rights. The state usually cannot force otherwise.
The real problem here is companies are not cops and should quit acting like cops.
The instant a company has evidence of a possible crime being committed they should be required to hand the evidence over to the police and then take no other action other than preventing distributing it or the like.
This is not just Google's AI goofing up on what constitutes CSAM (and it sounds like, given the witch hunt around such things, Google was being reasonable in informing the police), but also colleges expelling "rapists" without evidence, etc. The accused never gets anything resembling a fair trial, but since it's not the government doing it, that doesn't matter; there are no repercussions for messing up lives based on completely incompetent investigations.
They may not be cops but they’ve created an enabling technology. They’re also the only ones who could access the data and recognize its potential for abuse. It’s not an easy situation.
But clearly if they’re referring out to law enforcement, they need to close the loop on that and take responsibility when they get it wrong.
This is about more than just Google and CSAM. We have a more general problem with companies playing cop, and generally doing a terrible job of it. This case is simply one example of the problem; we should be focusing on the bigger picture.
It's time for the people to decide how they want companies that provide utilities to behave, and time for utility companies to stop telling the people how to behave.
In the olden days, if the AT&T monopoly just cut off phone service to a (convicted in court) pedo, they would get in severe trouble. We the people imposed limits on powerful companies. Even today, with the monopoly split up, this would not be legal. Let alone just deciding on their own initiative to do it.
In this case, a utility provider is cutting off service based on a digital rumor. They are judge, jury, and executioner.
The laws governing telcos were made over a period of 150 years, but most particularly in the 1920s and 1930s.
Google does not fit these laws because it does not charge for its services (perhaps this should be made illegal?) and monetizes them differently. Also, the services are obviously far beyond simple voice or fax. And yet, they are definitely utilities.
Utility companies must not be politically partisan or active. Mixing those two things is toxic and bad for society. It is also too much of a temptation for politicians to use the implied power of utilities over the people to silence or suppress opposition.
If Google wants to be an activist company, then it will need to shed its utilities. If Google wants to provide utilities, then it needs to shut down its activism.
Google's arrogance here is astonishing. The police say no fault, and it's the subject of an NYT investigation, and they still won't restore the account. What hope do the rest of us have?
I've been an Android user for a long time, but I think this might finally push me to switch to Apple. I'm just disgusted by this.
The EU should start fining the living daylights out of Google for not allowing people to get their data from their accounts, per the new digital gatekeepers act that big tech has six months to comply with.
They did it to Microsoft when Microsoft refused to comply with the browser ballot initiative, saying it was "technically near impossible". They started fining them 500,000 EUR/day or something. Then Microsoft magically made the ballot happen within two weeks.
Apparently having a "0 tolerance policy" on something means that even when an accusation is proven to be false, you'll still punish the accused. I am disgusted and Google should be ashamed.
Also, it looks like a powerful attack vector. Just slip some questionable content onto a victim's phone, and voila: a lot of trouble is under way, probably irreversibly.
Google has a responsibility, to a limited degree, to turn over to law enforcement anything that they know about abuse that comes through their system. More likely, the trigger for that is set really low as a corporate CYA and to pass the buck. I can totally see Google's point of view on this: We're providing a free* service to you, we're not going to stick our neck out and risk ANY liability of being blamed of storing/harboring/distributing abusive content... we would rather err way over on the side of insane caution and let law enforcement sort it out.
This might be the final push I needed to migrate off of Google services. It's been all too convenient to have a one stop shop for everything, but I couldn't imagine my rage if I lost all of my child's pictures because Google decided that the picture of their first bath (no genitals or face in frame) was too risky.
What's recommended for a domain registrar to move my domains over to?
I figure I can probably self-host photo/file backups, move 2FA to Bitwarden, and migrate mail over to a paid Protonmail plan, but who can I trust for domain names? Mostly just for email aliases, but a couple for some hobby websites. GoDaddy can take a hike, and I've used namecheap before but what other options are good/trusted?
Cloudflare does DNS and domain registration. They are subject to US law, but they have been very hands-off and pro free speech (too much so IMO, protecting abhorrent sites), which does give more trust in this case.
This highlights the enormous risk of depending on Google, and similar service providers, for email, messaging, and other important services. It isn't so much the policies as the fact that Google will never do anything to help you when they get it wrong.
You are always only one algorithmic fuck-up away from losing access and having to spend days and weeks dealing with the consequences.
I think the only way to deal with this is through regulation. Make it as inconvenient for Google to ignore customers as it is inconvenient for customers to be ignored by their service provider when something goes wrong.
Systemic mistreatment of customers ought to have consequences of existential proportions to a company. There is no societal benefit to companies like Google getting away with behaving this poorly.
The best way to solve this is to DE-regulate. Make Google etc explicitly not responsible for proactively monitoring people's private data and then said data will remain private.
Do you think Google et al. have done a good job of keeping their noses out of your business? In fact, I would challenge you to compare privacy in the EU vs. the US.
Opinions on what should be done have more credibility when based in observable reality rather than blind ideology.
> subsequent review of his account turned up a video from six months earlier that Google also considered problematic, of a young child lying in bed with an unclothed woman.
"Private" and "secure" are ambiguous terms that mean different things to different people. They're insisting it's private and secure because there is some level of privacy and security.
Whether you would consider that private and secure enough is a different story.
> “I decided it was probably not worth $7,000, [to sue Google]” he said.
This is a big part of the problem, technically you have a recourse, but the cost for individuals is a barrier to justice. Organisations have a lot of freedom to act behind the cost to litigate.
Are you located in the US? If so, I would be really concerned, not in this context but about a potential violation of HIPAA. All my communication with a doctor's office goes through the "private message" feature of the office/hospital's website, because there are regulations around that. Think about it: what if she lost her phone, and for some reason the phone was unlocked or decrypted by other means?
And even if you are not located in the US, I would recommend that your wife look up local regulations and consider alternative methods of communicating with patients.
Not in the US, but her patients, friends or acquaintances sometimes send stuff like this out of the blue. Can't really be avoided since it's unsolicited.
I doubt a doctor as the recipient would get in trouble--they have legitimate needs to look at things. And it's not a HIPAA violation if the patient sends it. It's only a HIPAA violation if the doctor sends it insecurely.
What's actually scary here is that these were newly taken photos, not existing CSAM flagged by hash value. That means Google is doing real-time image recognition on all of your photos. And that means Google has an ML model somewhere trained on millions of pictures of... yeah, this is fucked up.
While historical approaches to finding this content have relied exclusively on matching against hashes of known CSAM, the classifier keeps up with offenders by also targeting content that has not been previously confirmed as CSAM.[1]
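The distinction matters: matching can only recognize files already in a database, so it can never fire on a brand-new photo. A toy sketch of the idea (exact SHA-1 lookup here, purely for illustration; real systems use perceptual hashes such as PhotoDNA that survive resizing and re-encoding, and the classifier quoted above goes further still, scoring images no database has seen):

```python
import hashlib

# Hypothetical database of digests of previously confirmed material.
# (Real databases hold perceptual hashes, not exact byte digests.)
known_hashes = {hashlib.sha1(b"previously-confirmed-image-bytes").hexdigest()}

def is_known_match(image_bytes):
    """Exact-match lookup: flags only byte-identical copies of known files."""
    return hashlib.sha1(image_bytes).hexdigest() in known_hashes

# A newly taken photo is, by definition, absent from every hash database,
# so a pure matching pipeline can never flag it. A flag on a fresh photo
# therefore implies a model judging the image content itself.
```

Which is exactly why the parent comment's observation follows: new photos being flagged rules out matching alone and implies content classification.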
I have a larger issue that no one addressed. There has to be some type of special software for the medical profession that allows you to take a picture on your phone that is not stored on your phone and that you send to the doctor.
> There has to be some type of special software for the medical profession that allows you to take a picture on your phone that is not stored on your phone and that you send to the doctor.
No, you shouldn't have to use some special software to take a picture. Taking a photo with your phone's camera app should never trigger an account suspension.
I wouldn’t want private pictures I took just for a doctor to be synced with the cloud provider. It’s not like any of the cloud providers are HIPAA compliant.
The market leader is a system called Epic. It's like the SAP of the medical industry -- universally kind of hated; but it works well enough once set up.
I'm sort of surprised you haven't encountered it yet (and that the people writing responses to your comment haven't either).
Here's a PR piece about their secure video chat feature. (They also support emailing around photos, and the app has a built in camera, but that's old news, and not in their PR blog reel)
A specific medical oriented app. I’ve had a few virtual appointments where they ask me do I have an iPhone for FaceTime. I do. But what do they do for Android users? Which one of the ten soon to be discontinued Google video conferencing apps would the doctor use?
I’ve heard too many stories about how nonexistent Google’s customer service is to ever trust them with anything critical. You hardly ever hear stories like this about the other big tech companies. (I’m excluding FB, because I don’t care about FB enough.)
So I can walk into one of 272 Google Stores in the US and talk to a real person about any Google product I pay for like I can with Apple?
Can I depend on Google to support a Google-branded phone bought in 2013 with security updates eight years later, as an iPhone 5s user could, or with operating system updates for a phone bought in 2015?
The original poster couldn’t get anyone on the line about a Google account. You can call an Apple CSR about an iCloud issue.
Reporting the images to law enforcement is good. There should be a human in the loop to separate medical images from exploitative ones.
Perma-deleting his account on an automated accusation is bad. That should hinge on, at minimum, law enforcement's decision to charge a crime. [Edit: unless the criminality of the images is obvious - again, a human needs to be in the loop.]
> There should be a human in the loop to separate medical images from exploitative ones.
No, there really should not. I would not want a facebook employee to look at my pictures. I don't use their services, but the thought is pretty off-putting. The idea that these companies have to police content is what is wrong.
There are other ways to get to offenders here. An environment that takes good care of kids will spot it. Not some poor fella that needs to look at private images.
Perma-deleting the account is destruction of evidence, so even if the criminality is obvious, an account lock makes more sense.
Even an account lock is probably a bad idea; it alerts the pedophile that they're under investigation, allowing them to destroy evidence, cut ties with coconspirators, etc.
Best to let law enforcement deal with it. In this case, assuming it somehow went to trial, the jury would almost certainly acquit, and the account would be restored.
There is the matter of the accused losing access to the account while the case was active though. That's potentially a big deal.
> “I decided it was probably not worth $7,000,” he said.
lol. Missing 4 zeros there.
Part of the reason for the brazen actions of companies like Google is that their substantial financial means and large legal departments grant them a considerable degree of immunity to judicial review.
Also, some execs behind bars wouldn't be a bad idea.
Granted, mistakes do happen, but when they aren't resolved in a timely manner, punishment is fair.
Addendum: I don't know if companies have a govt-imposed rule to report porn; in that case I'd say the root cause is the govt, not the company. Of course, if the root cause is the govt, the even deeper root cause is the people themselves. People collectively generally get the govt they deserve...
> I don't know if companies have a govt imposed rule to report porn
In the US there isn't and generally cannot be a government requirement to search the customer's data. If there were, the provider would effectively be acting as an agent of the government and the customer would enjoy 4th amendment protection of their privacy (absent a warrant or other, similarly targeted and justified reason).
Unfortunately, there is a bit of a wink-nod situation going on where the government quietly pressures companies to engage in these activities "voluntarily" -- in exchange for various forms of preferential treatment and refraining from enforcing other regulations -- and in court, when a target attempts to present 4th amendment defenses, everyone pretends (and testifies) that the provider was searching the customer's private files of their own volition and not on the government's behalf.
In this game neither the provider's nor the government's hands are clean, because they are both conspiring to undermine the constitutional rights of the public.
It's long past time to break up every last Google server into fine dust particles, to be buried in a deep mineshaft with "this is not a place of honor" style warnings placed above.
controversial opinion: as much as everybody knows that China isn't exactly championing the western way of life and western democratic standards, I keep my private files in a Chinese cloud (backups are kept private in a NAS in my house).
Why?
Because they are not in contact with our authorities and, frankly, the chances my private files will be of any interest for Chinese authorities are close to zero.
Not that I have anything in particular to hide, but as this example proves once again: if life-damaging mistakes can happen, they will happen.
In a recent comment thread I noted that my father’s generation went from fighting a bitter war with Vietnam to Apple building MacBooks there. My grandfather landed on D-Day and drove Volkswagen Beetles for most of his life.
None of us know what will happen in the future. China could become a close ally. They’re already an existential economic partner.
The only path to real privacy is personal sovereignty. If you don’t control the data it is public. Period.
I am of the same generation as your father...
My grandfather was already 40 years old on D-Day.
> None of us know what will happen in the future
It's safe to assume that it doesn't matter.
You could die tomorrow, so why are you worrying?
> If you don’t control the data it is public
Unless the network is firewalled by Chinese government...
Safety is not about paranoia, but about layers.
car alarms aren't there to make it impossible to steal your car, but only to make it inconvenient for the thief and convince them to steal someone else's car.
> Because they are not in contact with our authorities and, frankly, the chances my private files will be of any interest for Chinese authorities are close to zero.
The problems that befell the fellow in this story were not due to Google being in contact with the authorities. The authorities unobtrusively investigated, determined that the reports were false positives, and closed the case.
If all Google had done was contact authorities he would have never even known that he was investigated, and there would have been no impact at all on his life.
China has bans in most of the same categories that the US and other western countries do, but typically broader (e.g., they broadly ban pornography). If ISPs there are on the lookout for things China bans you are probably more likely to have a false positive there than with a western ISP.
The question then is whether a Chinese ISP is more likely to overreact on a false positive than a western one. I believe China is more likely to hold a business responsible for the bad acts of that business's customers, which I expect would lead to Chinese businesses being more likely to overreact.
This is the same situation with "private" search engines.
Your search query is, ironically, less likely to be shared with the government if you are using Yandex than with DuckDuckGo, which is hosted by Microsoft (and before that, Amazon).
At least then it's an entire country blackmailing me; they must think I'm really important. Not some rando who hacked iCloud to find celebrities' boobs and post them online...
> Just encrypt your data rclone would do that.
yeah, but rclone is an offline backup, basically.
cloud storage is for when you need immediate access and search capabilities.
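Client-side encryption before upload doesn't have to mean giving up cloud storage entirely; the provider stores and syncs an opaque blob it cannot scan. Here's a toy Python sketch of the idea (a hash-based XOR stream cipher; this construction, the variable names, and the stand-in "photo" bytes are purely illustrative, not a vetted design - for real data use something like rclone's crypt remote, age, or gpg):

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream by hashing key + nonce + counter.
    Toy construction for illustration only; use a vetted tool in practice."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks))

key = os.urandom(32)                # never leaves your machine
photo = b"not really a JPEG, just stand-in bytes"
blob = encrypt(key, photo)          # this is all the cloud provider ever sees
assert blob[16:] != photo           # ciphertext is opaque to the provider
assert decrypt(key, blob) == photo  # only the key holder can recover it
```

The trade-off is exactly the one mentioned above: the provider can't scan the content, but it also can't index or search it for you.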
Please explain why this is different from traveling to any other country.
I haven't been to the US since they made it legal for customs officers to search travelers' personal electronics without a warrant and deny entry if you refuse, because thanks but no thanks.
That's an incredibly bad idea; you are exposing yourself to a lot more risk than with a western cloud.
They are not in contact with our authorities, until they decide they want to be. It's not like you are a Chinese factory owner making counterfeit Wranglers; I doubt they would deny any kind of request from a western government about a westerner.
Your access to the service is also at risk: at any time there could be a breakdown of relations leaving you unable to pay for the service. That would already have happened if you had chosen Russia instead of China.
Your behavior also looks suspicious to the western intelligence apparatus. Sending potentially terabytes to Chinese servers as a private individual may very well put you on their radar.
As others have noted you are setting yourself up as a prime candidate for an intelligence asset, they could at any point blackmail you to perform any action they want.
With what would they blackmail you? The terabytes of CSAM they could at any point plant in your account. Do you think they would be above doing that if they had anything to gain and were aware that you exist? Or do you think your Chinese provider would require a court order to give you up? Your entire bet is that they don't know you exist.
My main point, mostly as advice to others, is that you shouldn't put yourself in the hands of your adversary.
I don't even know how you trust their software to run on your system.
TBH my worry in this sort of setup is China cutting you off. Say tensions increase and the CCP throws down a decree that says all Chinese sites must block themselves from being reached by the US/Western Countries.
Given this story, you’d expect Google to make a yearly report saying that they successfully threw X number of pedophiles off their services, and the FBI convicted Y% of them. You’d think it would be something they and the government would love to crow about. But they don’t. Why?
> A Google spokeswoman said the company stands by its decisions, even though law enforcement cleared the two men.
Wow. Just wow. This is worse than Google's usual automated screw-ups. In this case, Google was notified of the issue by the NYT. Yet they actively chose to continue to screw over their victims just because they can.
> In a statement, Google said, “Child sexual abuse material is abhorrent and we’re committed to preventing the spread of it on our platforms.”
Just how tone deaf can Google be, continuing to treat these innocent folks as criminals in this passive aggressive statement even after being proven wrong? Do these people have no empathy at all?
I suppose it’s defensive behavior. If they admit their mistake now then they could potentially be liable for the damages caused by their mistake years ago. Now any lawsuit would need to determine if there was an error and harm instead of just quantifying the harm.
I’d like to contribute to a crowdsource fund to prosecute cases like this.
When I was a kid the Comic Book Legal Defense Fund [0] was set up to pay for lawyers to defend comic book stores that were being targeted by over eager police departments and civil suits.
Maybe something like the Google is an Asshole Legal Defense Fund could collect donations. The article mentions $7000 as the cost to prosecute this persons case. Crowdsourcing can help with that.
I am not saying it is right, but to a large degree this is the cost that some of us 'pay' for millions having 'free' Gmail/GDrive etc. Fully automated processes that close accounts, no due process to get them timely reinstated when the machine made an error.
You are correct: if they admit a mistake here, it will open the doors to lots of claims. I sometimes think they could get a lot of people to pay for the service (with $, not just by having their digital lives harvested) if those people knew they'd be treated better when something like this happens.
The question everyone needs to ask themselves, if Google closed your account right now, for good - what would that do to your life...
$7,000 is a pittance. Maybe this case is simple, but many will not be. Say they raid a house and confiscate a hard drive. Encrypted or not, that is going to be a huge thing. Arguments will be made about whether anything incriminating was stored on that drive. Just google the cost of a forensic expert witness. Both sides will need one.
Such costs are actually why so many police agencies are backing off of CP investigations. They still prosecute where evidence is clear, such as when someone emails such material openly, but they aren't willing to invest the tens and hundreds of thousands of dollars necessary to handle the complex cases involving encrypted communication/storage. $7,000 would be a bare minimum for only the simplest of legal defenses in the simplest of cases.
The article mentions two independent instances of this process within Google, where appeal is not possible even with a police report that completely exonerates the suspect.
It sounds to me as if a class action lawsuit is the most appropriate remedy for the unfortunates who are caught in this predicament. Their only problem is finding each other.
For the rest of us, it is unwise to use cloud storage for photos, for several reasons.
They don’t block CSAM because “it’s illegal” - in fact, they can’t be forced to do it without it breaking your 4th amendment rights. Instead, all CSAM reporting and blocking is done at-will by these companies, and some don’t participate (Apple[0]), so it’s a policy decision by these companies.
I imagine unblocking someone due to them being exonerated by a government entity is legally risky - perhaps doing so would be considered enough proof/evidence to deem the entire CSAM scanning practice as a search/seizure at the request of the government.
0: https://www.hackerfactor.com/blog/index.php?/archives/929-On... • “ According to NCMEC, I submitted 608 reports to NCMEC in 2019, and 523 reports in 2020. In those same years, Apple submitted 205 and 265 reports (respectively). It isn't that Apple doesn't receive more picture than my service, or that they don't have more CP than I receive. Rather, it's that they don't seem to notice and therefore, don't report.”
To be fair, users of many internet services exposed themselves. The warnings about that were loud and clear. I don't know if using their free service counts as a formal business relationship, that might make recourse more difficult. I think the service provider has the right to close any and all relationships unilaterally.
What's the incentive for Google to EVER give the accounts back? If they wrongly deactivate an account (like here), you get a bad article, EFF and friends ruminate about your behavior, and the world mostly moves on.
If Google wrongly gives an account back, you get a different article: "Google helped child pornographer even after discovering CP in their account". Now that gets attention. That's a scandal that leads to political action, criminal charges, etc.
To be clear, I'm not advocating for how Google behaves. They're a lot more like a utility and probably should be treated like one (alongside the protections and requirements that come with being a utility; you don't hear "electric company stopped serving a house because a man suspected of CP lives there").
For the responses saying "Well the police cleared them", again I don't disagree. But if you're an executive making this decision you're thinking:
1. We never give back an account in this case and avoid the massive downside risk
2. We go through a lot of work to design a process that will impact a marginal portion of customers, really really hope nobody manages to social-engineer themselves past it, and pray that no enterprising news outlet/politician tries to run the "Google helped CP person recover their CP" story; they already have a target on their back.
In what world would Google receive criticism for giving back accounts to people who have been proven innocent?
Google's surveillance system and automated ban hammers are already bad enough. But the actions they took following the ban in this case are egregious and 100% indefensible. At the very least, Google could've reinstated their victims' accounts and issued a full, sincere apology upon being contacted by the NYT. If Google had any care for its users, it would do that for everyone it wrongly reported whose name was later cleared. Instead, Google doubled down, continued to treat its victims as criminals in its statement, and even leaked details about intimate photos in a blatant attempt to discredit the users it wronged.
No parent should ever have to go through what Google put these parents through for seeking care for their child. Most of all, they should never have to risk losing custody of their child because their child got sick. They should never lose access to their whole digital identity because they didn't know any better than to rely on Google. Yet this is what Google did to these parents, full stop.
If due process was followed, and the police / state exonerated the parents, I don’t think anyone would blame Google for reactivating the accounts. They’d blame the police or whatever flawed exoneration process was used.
At least, that’s what I’d hope.
Google here looks even worse than I thought possible, and I’m a de-Googled, anti-fan, so I already had a very dim view of them.
Employees of this office are very small and delicate, and deserve protection from local pervs. Better a thousand innocent men be locked up than one guilty man roam free.
Google is not anything like a “utility”. A utility has a natural monopoly because of the effort, expense and disruption that is required to lay down the infrastructure and the need for scale. There is no product that Google has that you can’t and shouldn’t pay for a competitors product.
With the flick of a switch, they can deploy this technology to a global scale and way beyond protecting children. Imagine the feature propagating to their Chrome browser or their smart speakers "listening in" on what's happening in your home to "prevent" crimes by sending the cops whenever you raise your voice or say "the wrong things." This kind of power should not be in the hands of a single company.
And you can’t even escape these companies by moving to another country. They have their tendrils everywhere, except in places that consciously prevent them - which is usually done by even worse tyrannies like China/Russia.
Always makes me laugh when I see employees of Google, Amazon etc. claiming to be “anti-fascist” and “standing in solidarity with the common man” etc etc…
This is at best authoritarianism or corporatism, not fascism. There is no ultranationalism, there is no "othering", there is no enforced hierarchy of individuals.
Nope, the final call was made by NCMEC. The article is a bad one; it didn't highlight how the process works. And it's a federal law that Google is obliged to follow.
That's not what fascism is. Fascism is hyper nationalism where everything is done in service of the nation state. Private entities only exist insofar as they are extensions of the state.
What you're describing here is more like a cyberpunk corporatocracy, where corporations hold so much power independent of the state, that they are able to exercise their own arbitrary decisions extrajudicially, while still maintaining so much power over people so as to completely control their lives.
In fact, here you can see that the person was exonerated by the state but punished by the corporation. In fascism, nothing supersedes the state.
> Just how tone deaf can Google be, continuing to treat these innocent folks as criminals in this passive aggressive statement even after being proven wrong? Do these people have no empathy at all?
They don't care a single bit about the effect their actions have on others. They only care about not having to build a system which can distinguish such cases from actually criminal ones, because that wouldn't scale and would be bad for business $$$. So they try to turn and twist their image in the eyes of the public, claiming that what they did was "right", so that the public does not cry out and demand changes to their systems. Empathy doesn't even enter the equation for Google.
> They only care about having to build a system, which can distinguish such cases from actually criminal ones
There are only two ways to actually do that:
1) Make Google's policies 100% subservient to the United States legal system, which would look a lot like the "corporate / national lock-step unity" one sees in actual fascism
2) Have Google build its own court system, independent from the United States court system but with equivalent power
I’m amused because it’s a microcosm of the actual problem: it’s probably the default response to anything that has to do with child sexual abuse material, given without thought to context or circumstances, with too little review. But hey, I guess it’s Google’s official position that this dad is a child pornographer ¯\_(ツ)_/¯
> In a statement, Google said, “Child sexual abuse material is abhorrent
I love the use of the disclaimer "sexual" here, to make it clear they don't care about other types of child abuse (like interfering with access to health care, which Google is clearly guilty of in this case...)
I've said it before and I'll say it again: Google now has enough power that it has effectively turned into a globalist government, a government you did not vote for.
What do you think is the ratio of innocent to nefarious pictures of naked children Google encounters in aggregate?
This is relevant to how outraged one should be by this story. I think it is probably > 1:100000. As such, probably not much outrage is warranted, although it’s obviously not great for this one guy.
Wow, I'd guess the opposite when we consider the base rate. Seems like a classic Bayesian problem.
It's fairly normal for parents to take pictures of their children naked in the bathtub/at the beach/ camping/etc.
Conversely, I'd expect actual pedophiles and CSAM producers to be really quite rare.
So even a relatively low base-rate of normal parents with normal nude photos would likely dwarf CSAM upon detection.
So, if we say 1/100 are pedophiles, and 30/100 are parents, and of the parents 10% have such photos, the ratio I'd expect without getting into detection rates is like 3:1 in favor of normal parents.
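The comment's back-of-the-envelope ratio checks out with simple integer arithmetic (the population shares are the comment's own illustrative guesses, not measured data):

```python
# Assumed shares from the comment above, per 10,000 account holders
# (illustrative guesses, not real statistics).
population = 10_000
pedophiles = population * 1 // 100               # 1%  -> 100
parents = population * 30 // 100                 # 30% -> 3,000
parents_with_bath_photos = parents * 10 // 100   # 10% of parents -> 300

# Ratio of innocent-parent accounts to abusive accounts among accounts
# holding nude photos of children, ignoring detection rates entirely.
ratio = parents_with_bath_photos / pedophiles
print(ratio)  # 3.0 -> under these assumptions, flags skew ~3:1 innocent
```

This is the classic base-rate effect: even a generous guess for the rarer group gets swamped when the common group (ordinary parents) is thirty times larger.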
If it wasn't clear, people aren't objecting to the use of automated tools to prevent crime. It's that there is absolutely no avenue of appeal or review against them, even if law enforcement exonerates the accused and a big news outlet reports it.
Google has inserted itself into almost all spheres of digital life by hook or crook. It's practically difficult to avoid them in many services - especially email. And now they play judge, jury and executioner. I don't understand any of these are acceptable, much less justifiable. The old argument 'think of the kids' used to justify digital authoritarianism is such a cliché by now.
I’d put pretty good money on almost every family having pictures of their naked kids doing some shenanigans. I know I have those pictures. My parents have those pictures of me and my sister. Quite a few, as I seemed to enjoy trying to run about naked…
Bill Watterson somehow managed to sneak watercolor paintings of a naked little boy into every major newspaper under the guise of being a “comic strip”— the perv.
Somehow I think there are more parents who sometimes need to take a photo of their naked children than there are paedophiles.
Or at least the ratio is clearly not 1:100000; maybe it's closer to 1:10.
You would need statistics on how many times Google has reported to the police and how many times it has turned out to be a false alarm. Does Google even keep a record of false alarms? Most likely they don't, to avoid responsibility.
Well as a data point I have pictures of my children naked. As another data point my parents have photos of me as a child naked, and as a third data point my grandparents have photos of themselves as children naked. Whereas I don't knowingly know any paedophiles.
What are the ways to mitigate these issues for someone that wants a lot of the features of Google Photos? It seems that Amazon Photos is basically copying Google Photos and has a lot of the features. I wouldn't care if my Amazon account was closed down. And it is free if you are already a Prime member.
Telehealth had nothing to do with it other than causing the picture to be taken. The picture was taken with a Google-linked phone, the AI flagged it as CSAM. The transfer to the pediatrician was probably secure and not seen by Sauron's Eye.
>Mark’s wife grabbed her husband’s phone and texted a few high-quality close-ups of their son’s groin area to her iPhone so she could upload them to the health care provider’s messaging system.
It sounds like it was probably the texting (maybe via the Google Messages app?) that got the images flagged, rather than the telehealth system.
> When Mark’s and Cassio’s photos were automatically uploaded from their phones to Google’s servers, this technology flagged them. Jon Callas of the E.F.F. called the scanning intrusive, saying a family photo album on someone’s personal device should be a “private sphere.” (A Google spokeswoman said the company scans only when an “affirmative action” is taken by a user; that includes when the user’s phone backs up photos to the company’s cloud.)
I assume it was triggered when the photos were backed up to Google Photos based on the above quote.
..and with CSAM scanning, Apple is going to get into the same business. With this latest security update I would not be surprised if CSAM scanning has already been deployed.
Because why on earth would you oppose protecting children? /s
This is why I don't have a Dropbox account anymore.
I am extremely fortunate that the account that was deleted without recourse only contained data I had copies of on my hard drive, and to my knowledge law enforcement isn't involved.
The article fails to mention the stress and trauma of being accused of having CSAM. That remains to this day ... I'm posting from an alt because even the false accusation carries a potentially career and family destroying stigma.
What if someone emailed Google execs stock photos that are known to trigger Google's child abuse algorithm? They would have to build a way to reactivate banned accounts to get their own accounts back.
> Mark and his wife gave no thought to the tech giants that made this quick capture and exchange of digital data possible
Well... here we are, normal people don't think it's possible to transfer an image over the internet without a megacorp being in the middle of it. Pretty strong sign something has gone wrong.
"Apple announced plans last year to scan the iCloud for known sexually abusive depictions of children, but the rollout was delayed indefinitely after resistance from privacy groups."
We have a direct primary care doctor for our children and would never send a photo with genitalia in it. Either she comes for a house call or we come to her. This article confirms my fear.
They all embed themselves deeply into our communications, for not-so-altruistic purposes--allegedly to "serve us better", realistically to train the shit out of their AIs in the hopes of growing (or at least maintaining) market share. If people weren't such cattle, a hard line would have already been drawn. If...
Just don't make knives. Period. The reasoning doesn't matter. The innocent tool that you use for cooking could be stolen, could be lost, could end up donated to a thrift store where anyone could buy or shoplift it, and the knife could end up in the hands of a serial killer.
Your intent doesn't make it right. And you have to make sure a criminal can never get their hands on one.
What's next, actual serial killers declare themselves chefs and thus can receive and share knives with other chefs "for cooking reasons"?
Please. Don't be so naive guys. Don't make knives. How the hell is that not common sense?
---
That is how ridiculous your suggestion about pictures sounds to other people.
Yeah, just delay seeking medical treatment (for your child) until this coronavirus thing blows over.
The context is critical. Context is always critical.
Also, I'm not sure why an "actual" paedophile couldn't be an acupuncture professional; it's not exactly the first place you'd take your kids before the GP. To be clear: I don't think anyone should be able to just declare themselves some sort of qualified practitioner and claim or imply that they can provide a service or results that they cannot. But I'm really unsure why someone calling themselves an acupuncture professional (whether qualified or not) would be entitled to freely trade in sexual abuse material.
It doesn’t seem right for the doctor to ask the parents to take pictures then send them over SMS, email, or whatever they asked them to use. Why wouldn’t this just be done within the privacy of the doctor’s office?
You shouldn't have to travel to a doctor's office to get privacy. There is nothing wrong with a parent or a doctor taking pictures of a medical condition (rashes, etc).
It is beyond me that someone would use email to submit sensitive information. Pandemic aside, you should know better.
Also, I am sorry this happened. It is very human to respond to a person in authority - but we need to be better and start asking questions. It is our privacy at stake.
Hopefully everyone learns from this. Also, Google was doing the right things.
Where does it say that they used email to submit the information?
And I really don't see how Google insisting that banning them was the right thing to do and being cleared by police doesn't warrant undoing the ban is "doing the right things".
https://archive.ph/tL6wk
From my travels, my impression has been that America in particular treats child nudity as completely, unexceptionally obscene, beyond even adult nudity.
Compared to a beach in Europe, where nearly half of children under 2 run naked, there seems to be no grey area or minimum acceptability in the US.
It makes me wonder if our hypersexualized treatment of child nudity /actively contributes/ to the sexualization of children in our culture.
Yet all these upload filters have been pushed by the European Commission... Otherwise Google may have never adopted them. Current chapter is "chat control". https://www.patrick-breyer.de/en/posts/messaging-and-chat-co...
It was the US industry lobbying at the EC (which seemed to be admittedly an easy target) [1]
[1] https://netzpolitik.org/2022/dude-wheres-my-privacy-how-a-ho...
Haha yeah, in Europe there are often naked children running around at the beach. In Germany, there are places (especially lakes) where everyone is walking around, swimming and sunbathing naked - Sometimes the whole family. For me it was quite a shock initially but now I think it's fine.
I find that a bit weird, tbh. I mean, if we look at human tribes throughout the world and history, and even at what is most common nowadays among African tribes which maintain their dressing habits, they usually cover their genitals. That indicates to me that it is human nature to do so. It seems to me that some countries, for several reasons, are going against the grain purely out of culture.
But there's something weird about it. Just like when you look at previous periods in time and now they look weird: often they were just reactions to previous epochs or other cultures (like neoclassicism, which came as an opposition to rococo and baroque).
This feels like that. It doesn't feel natural. It feels like it comes purely from a contrarian way of thinking, about how forward-thinking they are: "look how superior we are that we are completely void of any instinct to cover ourselves". And perhaps it is somewhat of a sexual counterbalance to the strictness of the rest of their culture. I'm not sure I'm explaining this in the best way. I think there is a natural middle ground, the one we have seen throughout hunter-gatherer human history, and Nordic countries are just being culturally weird by going to one extreme.
People naked: ok. Pictures: not ok.
I don't think it's quite as strict as you're making it out to be.
I was in a park in Manhattan last week, which had a bunch of big sprinklers for kids to run through. No one was naked in the sprinkler, but some parents helped their children change in/out of their bathing suits out in the open.
(Then again, I only remember this because I was a tad surprised by it.)
In Japan, children play a game called Kancho, where they put their index fingers together and try to insert them into the anus of an unsuspecting friend.
In America, you could destroy someone's life over that (possibly even multiple lives).
A different article recently prompted me to wonder (I'm American): We know exactly at what age someone becomes 'legal' wrt images - do we know on which day of their life a person first becomes 'illegal'?
There is no such day. It's about the context of the images. Nude images of all ages are completely legal in the US.
You have the logic backwards, it's sexual images that have an age, not nude images.
In the US, there is an unspoken belief that preventing criminals from ""winning"" is more important than protecting citizens.
This is why police fired into a stolen minivan that had an unrelated child and a shoplifting/carjacking suspect in it, killing both. Protecting the child took a back seat to punishing the perp.
Therefore CP laws very quickly stopped being about protecting the children and became more about punishing the pedophiles.
Vindictiveness and spite are the unspoken role models of the US justice system.
Nudity in kids is seen as obscene purely because a pedophile could find it arousing. We are preemptively sexualizing children in order to prevent their sexualization.
A child was once forced to masturbate in front of a camera by COURT ORDER. Police held them down. All because the child was a victim of child pornography and the court wanted a "comparison image" in a similar arousal state to prove it was an image of that child.
Once again, a child was victimized by the justice system, in order to punish a perp.
Never make the mistake of thinking anybody in the justice system gives a rat's ass about protecting anyone.
> A child was once forced to masturbate in front of a camera by COURT ORDER. Police held them down. All because the child was a victim of child pornography and the court wanted a "comparison image" in a similar arousal state to prove it was an image of that child.
That summary is misleading and, I feel, wrong. It's not accurate to state that he was "a victim of child pornography"; the complaint in this case was against him, by the parent of another minor he was "sexting" with.
That doesn't excuse the police behavior here, but you're attempting to paint a picture where the victims in these cases are outright ignored in a misguided search for justice. You're twisting this case[1] to fit your narrative, I think.
Further, he sued the government and won. The courts made it perfectly clear that the lower courts and police were absolutely in violation of this teen's rights when they granted and executed the search warrant. So egregiously that "qualified immunity" doesn't even apply to the officer's estate.
I wasn't going to address this, but..
> Therefore CP laws very quickly stopped being about protecting children and became more about punishing pedophiles.
Pedophiles create a market through demand. Often it is also pedophiles that are on the supply side of this market, but not always. Merely participating in the demand side implicates you in these crimes against children as you are suborning their abuse. One could say that our system merely recognizes this fact.
[1]: https://arstechnica.com/tech-policy/2017/12/forcing-kid-to-m...
3 replies →
> In the us, there is an unspoken belief that preventing criminals from ""winning"" is more important than protecting citizens.
It's sad you are downvoted for this, because it sums up the American mentality perfectly. Some people really can't handle the truth.
This is a tall pile of claims without citations.
> Not only did he lose emails, contact information for friends and former colleagues, and documentation of his son’s first years of life, his Google Fi account shut down, meaning he had to get a new phone number with another carrier. Without access to his old phone number and email address, he couldn’t get the security codes he needed to sign in to other internet accounts, locking him out of much of his digital life.
> “The more eggs you have in one basket, the more likely the basket is to break,” he said.
I only have a google account to have access to Play store. Everything else, mail, calendar, photos and storage, has been moved off Google
Same here. Those shitty providers are too high a risk.
Many years ago, Microsoft banned my Outlook account for no reason. I tried contacting support to resolve the issue - nothing. Only after writing a letter did I receive an answer from Microsoft Germany:
My account had been banned for distributing porn through OneDrive (which I was not doing). The investigation showed that Microsoft could not find any wrongdoing. The account was gone nevertheless.
This is why I say:
- Skip Gmail/Outlook and all this crap; use something proper like mailbox.org with your own domain
- Don't trust the cloud. Back up once a month, use two clouds at the same time, and keep a cold-storage copy of your backups
- The biggest risk is your Apple iCloud account. Don't be dependent on them: don't use their credit card
- Don't use Google/MS 2FA apps
- Have multiple accounts in parallel: one for Google Drive, one for search, one for mail (if you need it), one for YouTube, etc.
Isn't the ability to migrate phone numbers a legal requirement for carriers?
Edit: yeah it is [1]
> Phone companies are required by law to port your number out when you start service with a new carrier. According to the FCC, a company can't refuse to port your number even if you have an outstanding balance or unpaid termination fees.
[1] https://telzio.com/blog/what-to-do-when-a-carrier-refuses-to...
When porting to a new carrier, the new carrier asks for your account information and PIN for the old carrier. The new carrier provides that to the old carrier to prove that the port request came from you.
It looks like with Google you are supposed to use the Google Fi app or website to obtain a "port out number" and PIN [1] to give the new carrier when they ask for account number and PIN.
If your accounts have been banned, I wonder whether you can still get that information.
[1] https://support.google.com/fi/answer/10888419?hl=en
2 replies →
This is why I’m something of a two factor authentication Luddite. I (think) I understand the benefits - but the fact that every factor is an additional single point of failure freaks me out.
TOTP keys, backed up physically (as in, written down and stored somewhere safe), are probably a good way to balance the failure mode with the security.
1 reply →
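To make that trade-off concrete: a TOTP code is pure arithmetic over the shared secret and the current time (RFC 6238), so the written-down base32 seed really is a complete backup - no app required. A standard-library-only sketch (the function name is mine):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """RFC 6238 TOTP: HMAC-SHA1 over the time-step counter, dynamically truncated."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation: low nibble picks the window
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Anyone (or any app) holding `secret_b32` computes the same codes - that is the whole protocol, and also why a lost seed is unrecoverable.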
You have to back up your 2FA keys. That's a must.
> The more eggs you have in one basket, the more likely the basket is to break
This. Use Yahoo for your email, Verizon for your phone number, Dropbox for your backups, Facebook for your socials… It doesn’t matter which you choose, but fragment your digital life. When one part goes bad, it won’t infect the others.
Honestly, I wish self-hosting were easy enough for ordinary people to do. Then there would be no reason to recommend that people fragment their digital life among dozens of ad-funded SaaS platforms. There is no reason why it has to be so hard, but not enough effort has gone into developing such systems, for obvious reasons. The Sandstorm project, for example, pioneered a model where users could deploy web apps the way they install mobile apps on their phone. Sadly, it never achieved the popularity it needed.
2 replies →
I'm still totally not off Google yet but this is the direction I'm going in. Fastmail for email (still need to do the POP/IMAP setup for local caching using Thunderbird), local ISP for phone number, Syncthing for syncing across PCs/Android phones (except iOS which lives in its own world) and Facebook has been deleted long ago. Just takes a while to get off Google because of years of sign-ups but I already got the financial and government services off it.
I agree and have completely extracted myself from their ecosystem, but this isn't a reasonable expectation to push on the general public.
When I started using Google, their motto was still "Don't be evil." My entire business and personal life runs on it, so I have no easy way to switch: 300 employees' emails, all documents, videos, and tons of custom scripts would have to change over.
One solution to many of these concerns is a periodic backup (via Takeout) of your Google account.
I have tried this many times, and literally the only time it succeeded was when I had it exported to Google Drive.
It feels like they built it to say that you can, but your data is still just as locked as before…
3 replies →
Do you have any suggestions for open-source Android backup solutions that can back up everything? Because I'm having a hard time finding any.
4 replies →
I'd be interested in automated backups/sync as well. Sounds like something like https://github.com/rtomac/cloud-services-backup-cli would do the trick...
This is not a "solution" - it just mitigates a portion of the disruption.
How does that help you get access to sites that you secured with Google MFA that you are now locked out of?
How do you automate this for daily backups?
3 replies →
I moved that off too. Aurora Store in anonymous mode works great and also shows privacy info of each app
The biggest problem with Play store is that many apps are geographically restricted. That includes banking and public transport apps. Some apps are restricted without even the knowledge of the developer. Most of those apps have no good reason at all to be restricted like that. Such apps often cannot be accessed through Aurora. Sadly, there are no laws or provisions to challenge such unethical tactics. Using Google's services is like living with a manipulative and abusive partner.
2 replies →
Moved where? Not to iCloud, I hope?
Self-hosting mail, calendar and media is too complex a task for 99% of the population. This is not a solution.
Nowadays you can buy a NAS, plug it into your router, click the "install Nextcloud" option, and you have it running; then you add the app on your phone. It's nowhere near as difficult to get this level of capability as it used to be, now that Docker is integrated in a user-friendly way in a lot of NAS software.
6 replies →
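For anyone without such a NAS UI, the same setup can be sketched as a Compose file. This follows the environment variables documented for the stock `nextcloud` and `mariadb` images; the passwords and published port here are placeholders, not recommendations:

```yaml
services:
  db:
    image: mariadb:10.11
    environment:
      MYSQL_ROOT_PASSWORD: change-me   # placeholder
      MYSQL_DATABASE: nextcloud
      MYSQL_USER: nextcloud
      MYSQL_PASSWORD: change-me        # placeholder
    volumes:
      - db:/var/lib/mysql

  app:
    image: nextcloud:stable
    depends_on:
      - db
    ports:
      - "8080:80"                      # Nextcloud web UI on the host
    environment:
      MYSQL_HOST: db
      MYSQL_DATABASE: nextcloud
      MYSQL_USER: nextcloud
      MYSQL_PASSWORD: change-me        # must match the db service
    volumes:
      - nextcloud:/var/www/html

volumes:
  db:
  nextcloud:
```

A NAS's "install Nextcloud" button is typically doing roughly this behind the scenes.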
Someone needs to put together a single-click Docker-Compose-type solution that will work on any Digital-Ocean-type shared host.
Proton, Hetzner Nextcloud
3 replies →
The biggest failure here is Google's continued refusal to segment their services and this has been a problem since at least the Google+ days, which is more than a decade ago at this point.
During Google+, Google controversially instituted a "real name" policy. It was driven entirely by Vic Gundotra, who would get up at company meetings when asked about it (back when he still answered questions, because believe me, that ended) and say "we don't want people named 'dog fart'".
Legitimate concerns that people might have for their safety were completely brushed aside.
Anyway, enforcement was of course automatic and I'm sure resulted in many false positives. But what happened when your account was flagged? You lost access to everything.
This too was criticized at the time and people asked "well if it's a policy violation for Google+, why do people lose their Gmail?". These questions too were completely brushed off.
At this time I decided I simply couldn't and wouldn't use any other Google service because my Gmail is too important to risk by an automatic ban from a false positive on a completely unrelated product.
And here we are, a decade later, with the exact same nonsense happening.
Now of course the CSAM ban itself is ridiculous. Refusing to reverse it is ridiculous. All of that is true. But don't tie your email or your phone to a company's other services when they do blanket bans like this.
Disclaimer: Xoogler.
What I don't understand is why the stupid policies established by a couple of troubled individuals years ago, for failed projects, are still in use today...
Give us the +word back... don't "improve" the stupid quotes.
This story should serve as a warning and a motivation to move away from iCloud, Google Cloud, and the like for video and photo storage. The easiest thing is to just purchase a drive or NAS and store your media on it. If you're a bit tech-savvy, you can run your own NextCloud. I run NextCloud on Hetzner, 1TB storage box + web server for a total of $7 a month. Or you can get Hetzner to run the NextCloud for you for about $4.70 a month for 1TB storage, but then you don't have full control over it. Then there are quite nice NextCloud apps for Android and iOS that you can configure to sync your photos and videos into your cloud.
> 1TB storage box
Hah, we've been a Hetzner customer for years now, but I had no idea they had this :D
https://www.hetzner.com/storage/storage-box
So you can connect Nextcloud to the storage box? Which protocol do you use for this?
Not OP, but I also use Hetzner and their storage boxes. The servers I use connect over Samba. For temporary access from other devices I usually use FTPS, but they also offer a bunch of other protocols.
It's a limited SSH, so you can use rsync and Borg for backups. You can also use Samba to connect to it, so I think Nextcloud would work over Samba.
Can Nextcloud do OCR on pictures, identify people, and allow search by objects in the photo? Those are the killer features that make Google Photos super useful for me.
Yes. https://github.com/marcelklehr/recognize - I run Nextcloud on a Raspberry Pi running yunohost.org for my whole family.
Sort of...not well. There is an app for NextCloud called Recognize [1].
It has gotten significantly better in the last year or so, but it's still pretty bad and pretty slow.
[1] https://apps.nextcloud.com/apps/recognize
You might want to check out something like https://photoprism.app/
1 reply →
No you see that requires work when most people would rather complain.
It's such a shame because having these services be healthy and popular benefits everyone.
I am a software professional and run my own email, and it's a massive pain in the ass. I also started using a Google account just for all the stupid sign-ups the internet makes you do, and use my real email only for actual correspondence, banking, and a few other things.
I worked in a 1 Hour Photo when I was in college. The standing rule was that photos of naked adults were not printed (the customer was given their negatives, though). For naked children, if it was the typical "kid taking their first bath" it was fine, but if there was any doubt we had a manager review it. I think we had to call the cops a couple of times (more for passing the buck than making an official decision like that ourselves), but there was a policy for things like that.
The point is that there were 2-3 layers of humans applying their best judgement and there was a process in place.
My dad as a teenager worked in one of these photo places with such a policy. Only, his boss printed all the naked pictures anyway and kept them.
Yeah but did you then cut off your customers' phone lines, redirect all their mail, steal all of their previous photo albums...
Why would you not print naked adults?
Puritans.
We “banned” alcohol for 13 years.
America’s greatest battle is with our dark religious past. We claim to be secular but really it’s an aspiration.
45 replies →
One example would be protecting employees from someone just bringing in photos of themselves with the sole intent of exposing themselves to the employee, as they have a captive audience.
8 replies →
Being forced to look at people's amateur porn at your job is pretty lame.
4 replies →
The tradition at 1 Hour Photos was to keep the prints of the naked adults for yourself.
When I used to go to house parties and raves I'd take my film to a 1Hour place in San Francisco staffed by two young gay boys. They'd print anything.
I keep telling anyone who will listen that they need to move every aspect of their online identity away from the big tech giants, and make sure that each type of service (email, webpage, memory storage, document storage, backups, contacts, etc.) is compartmentalized in such a way that if one gets removed, it won't affect the others in any way.
Imagine having all your memories in the form of images and videos taken away because of sloppy review work at a tech company.
I want to, really I do, but it feels like trading one devil for another.
Email is by far the biggest albatross around my neck. Migrating email requires me to set up two different failure points: a purchased domain and a Fastmail-like service. That's two different places where I could forget to pay a bill, be socially engineered into giving away my account, etc. To say nothing of the long tail of acquaintances who only know me by my current address.
Yet still, this existential terror exists that I will Do Something Wrong (no you will never know what it was), and lose everything.
E-mail should be the first thing you migrate, precisely because it is the key to every other service you rely on. If you're randomly banned from your E-mail service, you're pretty much screwed. E-mail is too important to let someone else host. Move it as close to your own control as your technical ability allows. Do it today, in fact, do it right now.
1 reply →
Autopay is simple to set up with any hosting provider. If you're in a financial place where the $5/mo and $10/yr payments might bounce... yeah, probably don't pay for email.
As far as the social engineering goes, personally I gladly take that risk in order to bring my email firmly into my control. If I'm going to lose access to my email, I want it to be because I did something stupid, not because an algorithm flagged my account and Google has no humans I can appeal to.
The long tail of acquaintances isn't bad either: just forward emails from Gmail to your new address. If you lose access to the Gmail account at some point, you'd have lost everything anyway, so this arrangement would be strictly better.
4 replies →
My pictures and videos are synced to iCloud, Google Photos, and MS OneDrive. My photos are synced to Amazon Drive too.
Do you use the same email or 2FA for all of those? If so, you still have a potential single point of failure if that provider decides to ban your account.
4 replies →
That sounds sorta expensive, but I get it. I'm at 3 TB of photos and videos, and it's becoming unmanageable... anyone aware of a self-hosted, multi-redundancy photo storage that doesn't involve manually copying large amounts of files? Or other non-time-consuming setups that work with iPhones (for me and my SO)?
2 replies →
I just do real-time photo sync to iCloud or Google, and a full backup of everything to Backblaze.
This event happens so rarely that it would be better advice to tell people to wear a bullet proof vest and helmet when going outside. It’s more likely to benefit them than self hosting media.
> This event happens so rarely
Does it though?
Also: I wear a helmet or a belt every single time I drive even for 1km. Should I not? I never needed them. 0% use rate so far for me, might as well not have done it.
The issue is that when it happens, you're done.
Considering that you probably don't need to keep all your eggs in one basket, why not at least try?
On this note, I wish providers were forced to provide a third-party backup mechanism for important data so that if I were to lose:
- my email: it could forward incoming messages so I at least receive them
- my photos: a copy could be kept on multiple hostings
etc
Anecdote: It's happened twice for me. Once was for an app I published they decided was against ToS, two years after I published it. And the other was an accidental account deletion when upgrading Google Apps (luckily this one was mostly recoverable... if you know the right people at Google).
You don't need to self-host, there are plenty of alternatives for each service offered by google.
14 replies →
How do you know how often it happens?
It's completely ridiculous that Google's customer support process is basically:
1) AI bot
2) Beg HN/twitter for insider help
3) Lawyers
That's absolutely insane for a product set that has become central to one's life
HN should add a [Corp Support] tab, if Google & Co won't create their own support system, let's do it here, and add a dark pattern leaderboard for not replying/acting
I’ve heard great things about Google Domains but this kind of story is exactly why I probably won’t be using that service. It’s just too risky if you lose everything at once.
Honestly: don't upload unencrypted content to anyone, for exactly this reason.
I have cloud backups of family photos, but they're all through restic or rclone with the crypt filter applied. Privacy is about the right to put yourself in context.
> Privacy is about the right to put yourself in context.
Wow. This is brilliant. Did you come up with this?
1 reply →
The problem with personal encryption for long-term storage is that it is easy to lose private keys and passwords.
2 replies →
Sorry, 99.99999% of the general population don't know what restic or rclone is. In fact, I wouldn't be surprised if 90% of software engineers have never heard of them. These things aren't really known outside circles like Hacker News.
1 reply →
> Privacy is about the right to put yourself in context.
Very well said.
Yeah, given the way Google likely ties your accounts together, a wrong decision on any Google account - even one separate from the account holding your domains - could end up with all your domains taken by Google.
What I'd like to know is whether they actually deactivate multiple linked accounts when any one of them gets flagged. I have three accounts: one for personal things, one more professional, and a third with my current country as its location for getting local apps in the Play Store.
Google knows I am the same person, even though they are different accounts, so are the two other accounts safe when one of the three gets flagged?
3 replies →
As a basic consumer, storing somewhat unimportant things on Google is potentially risky for that content, but you aren't paying them anything, so it's probably worth the trade-off. For a business, or for anything you pay for, Google's support is atrocious and not worth the hassle given all the horrendous failure modes it can put you in. One thing Google consistently teaches people: do not pay them directly, as they don't know how to treat their customers.
Using Google for anything at this point feels like a ‘told you so’ waiting to happen.
Not sure where you'd upload the photos to Google domains?
Assuming, of course, that you don't use your personal account for your domains - that'd be crazy!
If it’s personal domains, then it would make sense to use a personal account.
3 replies →
> Mark spoke with a lawyer about suing Google and how much it might cost.
> “I decided it was probably not worth $7,000,” he said.
I believe it is one of the roots of the problem. How is it even possible that getting justice in court in such a trivial case costs about three months of median income?
Well, it's really not straightforward to know what the outcome of such a lawsuit would be. Google's ToS is one thing, but the main point is that random users uploading family photos are not a protected group, so Google refusing service is very much their right. The state usually cannot force otherwise.
The state should. Otherwise, what’s the point of having it.
1 reply →
I get that "how much suing costs" doesn't imply winning the case; it's just the price of getting the case to court.
Have you ever hated child porn so much that you sent private medical photos of someone's naked children to multiple strangers?
If not, you obviously aren't as committed as Google about ending CSAM.
And the Center for Missing and Exploited Children is not a law enforcement agency and uses volunteers! That's outrageous.
I wonder how many pedophiles volunteer for this(or do content reviews for Google et al), and surreptitiously copy pictures they're intended to review.
The real problem here is companies are not cops and should quit acting like cops.
The instant a company has evidence of a possible crime being committed they should be required to hand the evidence over to the police and then take no other action other than preventing distributing it or the like.
This is not just Google's AI goofing up on what constitutes CSAM (and it sounds like given the witch hunt about such things that Google was being reasonable in informing the police), but colleges expelling "rapists" without evidence etc. The accused never gets anything resembling a fair trial but since it's not the government doing it that doesn't matter, there's no repercussions from messing up lives based on completely incompetent investigations.
They may not be cops but they’ve created an enabling technology. They’re also the only ones who could access the data and recognize its potential for abuse. It’s not an easy situation.
But clearly if they’re referring out to law enforcement, they need to close the loop on that and take responsibility when they get it wrong.
This is more than just Google and CSAM. We have a more general problem with companies playing cop--and generally doing a terrible job of it. This case is simply one example of the problem, we should be focusing on the bigger picture.
It's time for the people to decide how they want companies that provide utilities to behave, and time for utility companies to stop telling the people how to behave.
In the olden days, if the AT&T monopoly just cut off phone service to a (convicted in court) pedo, they would get in severe trouble. We the people imposed limits on powerful companies. Even today, with the monopoly split up, this would not be legal. Let alone just deciding on their own initiative to do it.
In this case, a utility provider is cutting off service based on a digital rumor. They are judge, jury, and executioner.
The laws governing telcos were made over a period of 150 years, but most particularly in the 1920s and 1930s.
Google does not fit these laws because they do not charge for their services (perhaps this should be made illegal?) and monetize them differently. Also, the services obviously go far beyond simple voice or fax. And yet, they are definitely utilities.
Utility companies must not be politically partisan or active. Mixing those two things is toxic and bad for society. It also is too much of a temptation for politicians to use the implied power of utilities over the people to silence or supress opposition.
If Google wants to be an activist company, then it will need to shed its utilities. If Google wants to provide utilities, then it needs to shut down its activism.
Google's arrogance here is astonishing. The police say no fault, and it's the subject of an NYT investigation, and they still won't restore the account. What hope do the rest of us have?
I've been an Android user for a long time, but I think this might finally push me to switch to Apple. I'm just disgusted by this.
> What hope do the rest of us have?
The EU starting to fine the living daylights out of Google for not letting people get their data out of their accounts, per the new digital gatekeepers act that big tech has six months to comply with.
They did it to Microsoft when Microsoft refused to comply with the browser ballot box initiative, saying that it was "technically near impossible". They started fining them 500,000 Eur/day or something. Then Microsoft magically made the ballot box happen within 2 weeks.
Apparently having a "0 tolerance policy" on something means that even when an accusation is proven to be false, you'll still punish the accused. I am disgusted and Google should be ashamed.
The Apple who caused an immense controversy with its CSAM scanning debacle?
All of Big Tech is ultimately not so different.
They cancelled the CSAM scanning project. In contrast, Google has been doing CSAM (and apparently AI porn detection) for years.
Also, the bone-headed Apple plan had safeguards that would have prevented the victim in the article from losing their account.
The two policies aren't remotely comparable.
5 replies →
Also, it looks like a powerful attack vector. Just slip questionable content onto a victim's phone, and voila: a lot of trouble is under way, probably irreversibly.
Especially because you can take photos with someone's phone without locking it. All you need is brief physical access to the phone.
This looks like an easy, critical attack method I'd never thought of. Very realistic CGI shown on a tablet could be used for it, instead of real CSAM.
1 reply →
You also need brief physical access to a naked child at the same time as the phone. That may be a bit more difficult to arrange.
2 replies →
"Google’s review team [then] flagged a video he made and the San Francisco Police Department had already started to investigate him."
That sounds like a permanent stain on his record.
Also a permanent stain on Google's reputation.
It will be forgotten about by tomorrow.
2 replies →
A stain on Google’s reputation? You won’t see a trace of this mild brown smudge on that layer of tar.
Google has a responsibility, to a limited degree, to turn over to law enforcement anything that they know about abuse that comes through their system. More likely, the trigger for that is set really low as a corporate CYA and to pass the buck. I can totally see Google's point of view on this: We're providing a free* service to you, we're not going to stick our neck out and risk ANY liability of being blamed of storing/harboring/distributing abusive content... we would rather err way over on the side of insane caution and let law enforcement sort it out.
2 replies →
This might be the final push I needed to migrate off of Google services. It's been all too convenient to have a one stop shop for everything, but I couldn't imagine my rage if I lost all of my child's pictures because Google decided that the picture of their first bath (no genitals or face in frame) was too risky.
What's recommended for a domain registrar to move my domains over to?
I figure I can probably self-host photo/file backups, move 2FA to Bitwarden, and migrate mail over to a paid Protonmail plan, but who can I trust for domain names? Mostly just for email aliases, but a couple for some hobby websites. GoDaddy can take a hike, and I've used namecheap before but what other options are good/trusted?
Cloudflare does DNS and domain registration. They are subject to US laws but they have been very hands off and pro free speech (too much IMO, protecting abhorrent sites) but that does give more trust in this case.
This highlights the enormous risk of depending on Google, and similar service providers, for email, messaging and other important services. It isn't so much the policies as the fact that Google will never do anything to help you when they get it wrong.
You are always only one algorithmic fuck-up away from losing access and having to spend days and weeks dealing with the consequences.
I think the only way to deal with this is through regulation. Make it as inconvenient for Google to ignore customers as it is inconvenient for customers to be ignored by their service provider when something goes wrong.
Systemic mistreatment of customers ought to have consequences of existential proportions to a company. There is no societal benefit to companies like Google getting away with behaving this poorly.
The best way to solve this is to DE-regulate. Make Google etc explicitly not responsible for proactively monitoring people's private data and then said data will remain private.
Do you think Google et al. have done a good job of keeping their noses out of your business? In fact, I would challenge you to compare privacy in the EU vs the US.
Opinions on what should be done have more credibility when based in observable reality rather than blind ideology.
They aren’t obligated to actively scan for this material (see Apple).
> subsequent review of his account turned up a video from six months earlier that Google also considered problematic, of a young child lying in bed with an unclothed woman.
Isn’t that also a description of breastfeeding?
It's not Google's place to say what is or isn't problematic.
Or, probably, millions of families with a small kid. Not everybody sleeps in pajamas.
Also, while we might find it unwise to make a video, we were not there.
But they keep insisting that their services and cloud storage are private and secure? Odd.
> private and secure
the meaning of both words being defined by their privacy policy and ToS
Private to Google's tracking algorithms. Secure from competitors.
Private and secure are ambiguous terms that mean different things to different people. They're insisting it's private and secure because there is some level of privacy and security.
Whether you would consider that private and secure enough is a different story.
These statements should be understood as Google's privacy, and Google's security. You're just an authorized user.
Lying to make money is hardly odd.
> “I decided it was probably not worth $7,000, [to sue Google]” he said.
This is a big part of the problem, technically you have a recourse, but the cost for individuals is a barrier to justice. Organisations have a lot of freedom to act behind the cost to litigate.
My wife is a doctor and receives these kinds of images on WhatsApp from her patients from time to time. Should she be concerned?
Are you located in the US? If so, I would be really concerned - not in this context, but about a potential violation of HIPAA. All my communication with a doctor's office goes through the "private message" feature of the office/hospital's website, because there are regulations around that. Think about it: what if she lost her phone, and for some reason it was unlocked or decrypted by other means?
And even if you are not located in the US, I would recommend that your wife looks up local regulation and considers alternative methods to communicate with patients.
Not in the US, but her patients, friends or acquaintances sometimes send stuff like this out of the blue. Can't really be avoided since it's unsolicited.
I doubt a doctor as the recipient would get in trouble--they have legitimate needs to look at things. And it's not a HIPAA violation if the patient sends it. It's only a HIPAA violation if the doctor sends it insecurely.
WhatsApp images are supposed to be E2E-encrypted, aren't they? Not sure about the whole Google Drive backup, though.
Google Photos has a mode that will auto-backup all received WhatsApp images. In that case she could face the same problem.
Perhaps switch to signal or telegram? Some service that guarantees confidentiality.
Instead of questioning the use of whatsapp for medical purposes by a medical professional you tell them to use signal or telegram?
5 replies →
Whatsapp and Signal are e2e encrypted. Telegram isn't.
2 replies →
What's actually scary here is that these were newly taken photos; not existing CSAM material flagged by hash value. That means Google is doing real time image recognition on all of your photos. And that means Google has an ML model somewhere trained on millions of pictures of.... yeah this is fucked up.
From a link in the article:
While historical approaches to finding this content have relied exclusively on matching against hashes of known CSAM, the classifier keeps up with offenders by also targeting content that has not been previously confirmed as CSAM.[1]
[1] https://www.blog.google/around-the-globe/google-europe/using...
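To make the distinction the blog post draws concrete, here is a toy sketch of the two approaches. Everything in it is illustrative: real deployments use perceptual hashes (e.g. PhotoDNA) rather than a cryptographic hash like SHA-256, and the classifier below is a bare placeholder for a trained ML model, not Google's actual system.

```python
import hashlib

# Hash matching: exact lookup against a database of known-bad hashes.
# It can only catch files that are already in the database.
known_bad_hashes = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def hash_match(image_bytes: bytes) -> bool:
    """Return True only if this exact file is already in the database."""
    return hashlib.sha256(image_bytes).hexdigest() in known_bad_hashes

def classifier_score(image_bytes: bytes) -> float:
    """Placeholder for an ML model that scores previously unseen images.

    This is the part that makes newly taken photos flaggable at all --
    and also the part that produces false positives like the one in
    the article.
    """
    return 0.0  # a real system would run a trained model here

# A newly taken photo can never match the hash database...
new_photo = b"newly-taken-photo-bytes"
print(hash_match(new_photo))   # False
# ...so only the classifier path could have flagged the photos in the story.
```

The point of the contrast: hash matching has essentially no false positives on new content (it simply never fires), while a classifier trades that guarantee for coverage of unseen material.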
I have a larger issue that no one addressed. There has to be some type of special software for the medical profession that allows you to take a picture on your phone that is not stored on your phone and that you send to the doctor.
You are asking an unfortunately reasonable question. The fucked up thing is that your normal camera app should suffice for this.
It should suffice if you disable cloud sync and backups.
1 reply →
> There has to be some type of special software for the medical profession that allows you to take a picture on your phone that is not stored on your phone and that you send to the doctor.
No, you shouldn't have to use some special software to take a picture. Taking a photo with your phone's camera app should never trigger an account suspension.
I wouldn’t want private pictures I took just for a doctor to be synced with the cloud provider. It’s not like any of the cloud providers are HIPAA compliant.
The market leader is a system called Epic. It's like the SAP of the medical industry -- universally kind of hated; but it works well enough once set up.
I'm sort of surprised you haven't encountered it yet (and that the people writing responses to your comment haven't either).
Here's a PR piece about their secure video chat feature. (They also support emailing around photos, and the app has a built in camera, but that's old news, and not in their PR blog reel)
https://www.epic.com/epic/post/expanding-telehealth-during-t...
They are doing some interesting stuff with outpatient monitoring and treatment-specific workflows, apparently:
https://www.epic.com/epic/archive/epic-outcomes
You have a better chance of Google just doing the right thing than that happening.
The health sector isn't known for having a secure anything, furthermore getting health companies to agree on one thing together is almost impossible.
Snap chat?
A specific medical oriented app. I’ve had a few virtual appointments where they ask me do I have an iPhone for FaceTime. I do. But what do they do for Android users? Which one of the ten soon to be discontinued Google video conferencing apps would the doctor use?
6 replies →
I’ve heard too many things about how non existent Google’s customer service is to ever trust them for anything critical. You hardly ever hear stories like this from any of the other big tech companies. I’m excluding FB, because I don’t care about FB enough.
Can you name other big tech companies with good customer service that is purely software?
You can get customer service from Google's hardware just like Apple.
It's the software/services that have awful/non-existent customer service.
So I can walk into one of 272 Google Stores in the US and talk to a real person about any Google product I pay for like I can with Apple?
I can depend on Google to support a Google branded phone I bought in 2013 with security updates 8 years later like an iPhone 5s user could or with operating system updates for a phone I bought in 2015?
The original poster couldn’t get anyone on the line about a Google account. You can call an Apple CSR about an iCloud issue.
Reporting the images to law enforcement is good. There should be a human in the loop to separate medical images from exploitative ones.
Perma-deleting his account on an automated accusation is bad. That should hinge on, at minimum, law enforcement's decision to charge a crime. [Edit: unless the criminality of the images is obvious - again, a human needs to be in the loop.]
> Reporting the images to law enforcement is good.
citation needed.
do these CSAM scanning things actually help reduce kid exploitation?
and if they do, is this the best use of our resources?
> There should be a human in the loop to separate medical images from exploitative ones.
No, there really should not. I would not want a facebook employee to look at my pictures. I don't use their services, but the thought is pretty off-putting. The idea that these companies have to police content is what is wrong.
There are other ways to get to offenders here. An environment that takes good care of kids will spot it. Not some poor fella that needs to look at private images.
Perma-deleting the account is destruction of evidence, so even if the criminality is obvious, an account lock makes more sense.
Even an account lock is probably a bad idea; it alerts the pedophile that they're under investigation, allowing them to destroy evidence, cut ties with coconspirators, etc.
Best to let law enforcement deal with it. In this case, assuming it somehow went to trial, the jury would almost certainly acquit, and the account would be restored.
There is the matter of the accused losing access to the account while the case was active though. That's potentially a big deal.
This was a predictable outcome of the following factors:
* Criminalizing possession and creation of child pornography equivalently
* Conscripting the tech sector into detecting it via SESTA/FOSTA.
Look forward to reading stories like this ad nauseam.
You could shorten your comment to "This was a predictable outcome of conscripting the tech sector", and it would be more accurate / precise.
Child pornography is only one of many areas where false positives in legally mandated sescanners end up ruining people.
Credit card payments come to mind, though I think those are mostly self-regulation.
What's an "sescanner"?
> “I decided it was probably not worth $7,000,” he said.
lol. Missing 4 zeros there.
Part of the reason for the brazen actions of companies like Google is that their substantial financial means and legal department sizes grant them a substantial degree of immunity to judicial review.
>lol. Missing 4 zeros there.
Also some execs behind bars won't be a bad idea. Granted that mistakes do happen, but when they don't resolve it in a timely manner, punishment is fair.
Addendum: I don't know if companies have a govt-imposed rule to report porn; in that case I'd say the root cause is the govt, not the company. Of course if the root cause is the govt, the even deeper root cause is the people themselves. People collectively generally get the govt they deserve...
> I don't know if companies have a govt imposed rule to report porn
In the US there isn't and generally cannot be a government requirement to search the customer's data. If there were, the provider would effectively be acting as an agent of the government and the customer would enjoy 4th amendment protection of their privacy (absent a warrant or other, similarly targeted and justified reason).
Unfortunately, there is a bit of a wink-nod situation going on where the government quietly pressures companies to engage in these activities "voluntarily" -- in exchange for various forms of preferential treatment and refraining from enforcing other regulations -- and in court when a target attempts to present 4th amendment defenses everyone pretends (and testifies) that the provider was searching the customer's private files of their own volition and not on the government's behalf.
In this game neither the provider's nor the government's hands are clean, because they are both conspiring to undermine the constitutional rights of the public.
Time to break Google up into smaller parts that may allow a competitive market again. Google, FB, and Apple could all use a bit of competition.
It's long past time to break up every last Google server into fine dust particles, to be buried in a deep mineshaft with "this is not a place of honor" style warnings placed above.
controversial opinion: as much as everybody knows that China isn't exactly championing the western way of life and western democratic standards, I keep my private files in a Chinese cloud (backups are kept private in a NAS in my house).
Why?
Because they are not in contact with our authorities and, frankly, the chances my private files will be of any interest for Chinese authorities are close to zero.
Not that I have nothing in particular to hide, but as this example proves once again, if life damaging mistakes can happen, they will happen.
You are very naïve.
In a recent comment thread I noted that my father’s generation went from fighting a bitter war with Vietnam to Apple building MacBooks there. My grandfather landed on D-Day and drove Volkswagen Beetles for most of his life.
None of us know what will happen in the future. China could become a close ally. They’re already an existential economic partner.
The only path to real privacy is personal sovereignty. If you don’t control the data it is public. Period.
> You are very naïve.
I am of the same generation as your father...
My grandfather was already 40 years old on D-Day.
> None of us know what will happen in the future
It's safe to assume that it doesn't matter.
You could die tomorrow, so why are you worrying?
> If you don’t control the data it is public
Unless the network is firewalled by Chinese government...
Safety is not about paranoia, but about layers.
car alarms aren't there to make it impossible to steal your car, but only to make it inconvenient for the thief and convince them to steal someone else's car.
1 reply →
> If you don’t control the data it is public. Period.
Funny, censorship (which this case somewhat is) is about making things not public. Though I somewhat agree.
> Because they are not in contact with our authorities and, frankly, the chances my private files will be of any interest for Chinese authorities are close to zero.
The problems that befell the fellow in this story were not due to Google being in contact with the authorities. The authorities unobtrusively investigated, determined that the reports were false positives, and closed the case.
If all Google had done was contact authorities he would have never even known that he was investigated, and there would have been no impact at all on his life.
China has bans in most of the same categories that the US and other western countries do, but typically broader (e.g., they broadly ban pornography). If ISPs there are on the lookout for things China bans you are probably more likely to have a false positive there than with a western ISP.
The question then is whether a Chinese ISP is more likely to overreact on a false positive than a western ISP. I believe China is more likely to hold a business responsible for the bad acts of that business's customers, which I expect would lead to Chinese businesses being more likely to overreact.
This is the same situation with "private" search engines. Your search query is ironically less likely to be shared with government if you are using Yandex, than with DuckDuckGo that is hosted by Microsoft (before it was even Amazon).
Those countries would love to blackmail you. Just encrypt your data rclone would do that.
> Those countries would love to blackmail you
based on what?
At least it would be an entire country blackmailing me -- they must think I'm really important -- not some rando who hacked iCloud to find celebrities' boobs and post them online...
> Just encrypt your data rclone would do that.
yeah, but rclone is an offline backup, basically.
cloud storage is for when you need immediate access and search capabilities.
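For what it's worth, the encryption the grandparent comment mentions doesn't force rclone into offline-backup-only use: a crypt remote wraps an ordinary cloud remote and encrypts file contents and names client-side before upload, and you can still sync or mount it for immediate access. A minimal config sketch, where the remote names ("gdrive", "gdrive-crypt") and the backup path are examples, not a prescription:

```ini
; ~/.config/rclone/rclone.conf
[gdrive]
type = drive

[gdrive-crypt]
type = crypt
remote = gdrive:backup
filename_encryption = standard
directory_name_encryption = true
; password is set interactively via "rclone config" and stored obscured
```

Then `rclone sync ~/photos gdrive-crypt:` uploads only ciphertext, so the provider's scanners see nothing scannable; the trade-off is that you also lose the provider's own search features and must search through rclone on your side.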
7 replies →
This is fine as long as you don’t plan to travel to China in the future.
Please explain why this is different from traveling to any other country.
I haven't been to the US since they made it legal for custom officers to search travelers' personal electronics without a warrant and deny entry if you refuse, because, thanks but no thanks.
9 replies →
Or to any China-aligned or vassal states, they don't need to bother with complex extradition procedures, they just ask the government 'nicely'. eg https://www.reuters.com/article/us-cambodia-china-uighurs-id...
1 reply →
That's an incredibly bad idea, you are exposing yourself to a lot more risk, than using a western cloud.
They are not in contact with our authorities until they decide they want to be. It's not like you are a Chinese factory owner making counterfeit Wranglers; I doubt they would deny any kind of request from a western government about a westerner.
Your access to the service is also under risk, at any time there could be a breakdown of relations leaving you unable to pay for the service, that would have happened already, if you had chosen Russia instead of China.
Your behavior also looks suspicious to the western intelligence apparatus. Sending potentially terabytes to Chinese servers as a private individual may very well put you on their radar.
As others have noted you are setting yourself up as a prime candidate for an intelligence asset, they could at any point blackmail you to perform any action they want.
With what would they blackmail you? The terabytes of CSAM they could at any point plant in your account. Do you think they would be above doing that, if they had anything to gain and were aware that you exist? Or do you think your Chinese provider would require a court order to give you up? Your entire bet is that they don't know that you exist.
My main point, mostly as advice to others, is that you shouldn't put yourself in the hands of your adversary.
I don't even know how you trust their software to run on your system.
PS: If you think all these are farfetched and paranoid, I will remind you that China routinely takes hostages to achieve diplomatic concessions https://en.wikipedia.org/wiki/Hostage_diplomacy#China
TBH my worry in this sort of setup is China cutting you off. Say tensions increase and the CCP throws down a decree that says all Chinese sites must block themselves from being reached by the US/Western Countries.
Can you recommend your provider? I've been wanting to do the same thing for a while.
can't edit the post anymore
My future proof solution will be hosting my data on bare metal in Iceland.
They are quite serious about data privacy.
This. China, unlike the U.S., is not gonna ask other countries to extradite me if they ever find me in violation of whatever bullshit law they have.
That's because it would be largely ineffective. States use the powers they have.
1 reply →
My parents gave me an old photo album I was going to digitize. It includes a photo of me age 1 having a bath in a wash tub.
It never occurred to me that this might get my account banned.
Given this story, you’d expect Google to make a yearly report saying that they successfully threw X number of pedophiles off their services, and the FBI convicted Y% of them. You’d think it would be something they and the government would love to crow about. But they don’t. Why?
Google used to have something on their front page to the tune of "What other people are searching right now!"
To sort of add a lively "Here's what's trending!" kinda thing
They pulled that feature really fast when it made users acutely aware that their searches were not at all private.
So... Likely this reason.
I wonder what happens if I share the Nevermind album cover?
The copyright filter nails you to a flaming pentagram while the priests of Syrinx surround you and chant. In the style of Rush's 2112.
(For the first time ever, I wish I had a DALL-E 2 account)
Or Virgin Killer by Scorpions.
I think the baby from that cover tried his luck, and lost, in court recently
https://en.wikipedia.org/wiki/Nevermind#2021_lawsuit
Did Google actively scan and classify user’s photos? Because it seems unlikely that these particular photos would match one from the CSAM database.
That is the implication of this story. Probably flagged by AI, reviewed by hand, then reported. I doubt it's unique to Hangouts; probably anything in the cloud.
> A Google spokeswoman said the company stands by its decisions, even though law enforcement cleared the two men.
Wow. Just wow. This is worse than Google's usual automated screw-ups. In this case, Google was notified of the issue by the NYT. Yet they actively chose to continue to screw over their victims just because they can.
> In a statement, Google said, “Child sexual abuse material is abhorrent and we’re committed to preventing the spread of it on our platforms.”
Just how tone deaf can Google be, continuing to treat these innocent folks as criminals in this passive aggressive statement even after being proven wrong? Do these people have no empathy at all?
I suppose it’s defensive behavior. If they admit their mistake now then they could potentially be liable for the damages caused by their mistake years ago. Now any lawsuit would need to determine if there was an error and harm instead of just quantifying the harm.
I’d like to contribute to a crowdsource fund to prosecute cases like this.
When I was a kid the Comic Book Legal Defense Fund [0] was set up to pay for lawyers to defend comic book stores that were being targeted by over eager police departments and civil suits.
Maybe something like the Google is an Asshole Legal Defense Fund could collect donations. The article mentions $7,000 as the cost to prosecute this person's case. Crowdsourcing can help with that.
[0] https://en.wikipedia.org/wiki/Comic_Book_Legal_Defense_Fund
I am not saying it is right, but to a large degree this is the cost that some of us 'pay' for millions having 'free' Gmail/GDrive etc. Fully automated processes that close accounts, no due process to get them reinstated in a timely manner when the machine makes an error. You are correct: if they admit a mistake here, it will open the doors to lots of claims. I sometimes think they could get a lot of people to pay for the service (with $, not just by having their digital lives harvested) if those people knew they'd be treated better when something like this happens.
The question everyone needs to ask themselves: if Google closed your account right now, for good, what would that do to your life...
$7,000 is a pittance. Maybe this case is simple, but many will not be. Say they raid a house and confiscate a hard drive. Encrypted or not, that is going to be a huge thing. Arguments will be made about whether anything incriminating was stored on that drive. Just google the cost of a forensic expert witness. Both sides will need one.
Such costs are actually why so many police agencies are backing off of CP investigations. They still prosecute where evidence is clear, such as when someone emails such material openly, but they aren't willing to invest the tens and hundreds of thousands of dollars necessary to handle the complex cases involving encrypted communication/storage. $7,000 would be a bare minimum for only the simplest of legal defenses in the simplest of cases.
13 replies →
The article mentions two independent instances of this process within Google, where appeal is not possible even with a police report that completely exonerates the suspect.
It sounds to me as if a class action lawsuit is the most appropriate remedy for the unfortunates who are caught in this predicament. Their only problem is finding each other.
For the rest of us, it is unwise to use cloud storage for photos, for several reasons.
They don’t block CSAM because “it’s illegal” - in fact, they can’t be forced to do it without it breaking your 4th amendment rights. Instead, all CSAM reporting and blocking is done at-will by these companies, and some don’t participate (Apple[0]), so it’s a policy decision by these companies.
I imagine unblocking someone due to them being exonerated by a government entity is legally risky - perhaps doing so would be considered enough proof/evidence to deem the entire CSAM scanning practice as a search/seizure at the request of the government.
0: https://www.hackerfactor.com/blog/index.php?/archives/929-On... • “ According to NCMEC, I submitted 608 reports to NCMEC in 2019, and 523 reports in 2020. In those same years, Apple submitted 205 and 265 reports (respectively). It isn't that Apple doesn't receive more picture than my service, or that they don't have more CP than I receive. Rather, it's that they don't seem to notice and therefore, don't report.”
10 replies →
It’s unwise to depend on Google with decades of history of poor customer service.
11 replies →
To be fair, users of many internet services exposed themselves. The warnings about that were loud and clear. I don't know if using their free service counts as a formal business relationship, that might make recourse more difficult. I think the service provider has the right to close any and all relationships unilaterally.
Cloud storage for anything valuable.
Note that it was the telehealth provider who provided the images to Google in the first place, not the SF techie dad.
2 replies →
What's the incentive for Google to EVER give the accounts back? If they wrongly deactivate an account (like here), you get a bad article, EFF and friends ruminate about your behavior, and the world mostly moves on.
If Google wrongly gives an account back, you get a different article: "Google helped child pornographer even after discovering CP in their account". Now that gets attention. That's a scandal that leads to political action, criminal charges, etc.
To be clear, I'm not advocating for how Google behaves. They're a lot more like a utility and probably should be treated like one (alongside the protections and requirements that come with being a utility -- you don't hear "electric company stopped serving a house since a man suspected of CP lives there").
For the responses saying "Well the police cleared them", again I don't disagree. But if you're an executive making this decision you're thinking:
1. We never give back an account in this case and avoid the massive downside risk
2. We go through a lot of work to design a process that will impact a marginal portion of customers and really really hope nobody manages to social engineer themselves past, and pray that no enterprising news outlet/politician tries to make the "Google helped CP person recover their CP story" - they already have a target on their back.
In what world would Google receive criticism for giving back accounts to people who have been proven innocent?
Google's surveillance system and automated ban hammers are already bad enough. But the actions they took following the ban in this case is egregious and 100% indefensible. At the very least, Google could've reinstated their victim's account and issued a full sincere apology upon being contacted by the NYT. If Google has any care for their users, they'd do that for every people they wrongly reported who had their names cleared. Instead, Google doubled down, continued to treat their victims as criminals in their statement, and even leaked details about intimate photos in a blatant attempt to discredit the users they wronged.
No parent should ever have to go through what Google has put them through when trying to get care for their child. Most of all, they should never have to risk losing custody of their child because their child got sick. They should never lose access to their whole digital identity because they didn't know any better than to rely on Google. Yet this is what Google did to these parents, full stop.
5 replies →
If due process was followed, and the police / state exonerated the parents, I don’t think anyone would blame Google for reactivating the accounts. They’d blame the police or whatever flawed exoneration process was used.
At least, that’s what I’d hope.
Google here looks even worse than I thought possible, and I’m a de-Googled, anti-fan, so I already had a very dim view of them.
2 replies →
What's the incentive for Google to EVER give the accounts back?
Ethical, moral responsibility? Being a nice entity. And it takes zero effort to do so, especially once he is cleared by the cops?
You know, stuff like Don't be evil? Oh wait...
Employees of this office are very small and delicate, deserve protection from local pervs. Better a thousand innocent men are locked up than one guilty man roam free.
— Dwight K. Schrute
Google is not anything like a “utility”. A utility has a natural monopoly because of the effort, expense and disruption that is required to lay down the infrastructure and the need for scale. There is no product that Google has that you can’t and shouldn’t pay for a competitors product.
Welcome to fascism.
A private company is the de-facto judge, jury and executioner because it owns half the infrastructure you live your life on.
With the flick of a switch, they can deploy this technology to a global scale and way beyond protecting children. Imagine the feature propagating to their Chrome browser or their smart speakers "listening in" on what's happening in your home to "prevent" crimes by sending the cops whenever you raise your voice or say "the wrong things." This kind of power should not be in the hands of a single company.
Edit: I recently read this book called "The Every" which explores a similar scenario https://www.goodreads.com/en/book/show/57792078-the-every
9 replies →
And you can’t even escape these companies by moving to another country. They have their tendrils everywhere, except in places that consciously prevent them - which is usually done by even worse tyrannies like China/Russia.
Always makes me laugh when I see employees of Google, Amazon etc. claiming to be “anti-fascist” and “standing in solidarity with the common man” etc etc…
35 replies →
"The word fascism has now no meaning except in so far as it signals 'something not desirable.'" --George Orwell
23 replies →
This is at best authoritarianism or corporatism, not fascism. There is no ultranationalism, there is no "othering", there is no enforced hierarchy of individuals.
4 replies →
Nope, the final call was made by NCMEC. The article is a bad one; it didn't explain how the process works. And it's a federal law that Google is obliged to follow.
https://uscode.house.gov/view.xhtml?req=granuleid:USC-prelim...
That's not what fascism is. Fascism is hyper nationalism where everything is done in service of the nation state. Private entities only exist insofar as they are extensions of the state.
What you're describing here is more like a cyberpunk corporatocracy, where corporations hold so much power independent of the state, that they are able to exercise their own arbitrary decisions extrajudicially, while still maintaining so much power over people so as to completely control their lives.
In fact, here you can see that the person was exonerated by the state but punished by the corporation. In fascism, nothing supersedes the state.
That doesn't really sound like fascism. Closer to feudalism than anything else.
Unfortunately there are some issues that people get so angry over, they'll support fascism openly if it means being against it
You misspelled libertarianism
> Just how tone deaf can Google be, continuing to treat these innocent folks as criminals in this passive aggressive statement even after being proven wrong? Do these people have no empathy at all?
They don't care a single bit about the effect their actions have on others. They only care about avoiding having to build a system that can distinguish such cases from actually criminal ones, because that wouldn't scale and would be bad for business $$$. So they turn and twist the story in the public eye, insisting what they did was "right", so that the public does not cry out and demand change to their systems. Empathy doesn't even enter the equation for Google.
> They only care about having to build a system, which can distinguish such cases from actually criminal ones
There are only two ways to actually do that:
1) Make Google's policies 100% subservient to the United States legal system, which would look a lot like the "corporate / national lock-step unity" one sees in actual fascism
2) Google build its own court system, independent from the United States court system but with equivalent power
Are either of those scenarios desirable?
1 reply →
I’m amused because it’s a microcosm of the actual problem: it’s probably the default response to anything that has to do with child sexual abuse material, given without thought to context or circumstances, with too little review. But hey, I guess it’s Google’s official position that this dad is a child pornographer ¯\_(ツ)_/¯
> In a statement, Google said, “Child sexual abuse material is abhorrent
I love the use of the disclaimer "sexual" here, to make it clear they don't care about other types of child abuse (like interfering with access to health care, which Google is clearly guilty of in this case...)
Well crafted weasel words, PR folks!
I wonder why the person decided 7k was not worth the lawsuit. Their legal counsel told them it was hopeless to win anything?
I’m more and more convinced that they just never bothered to write the ban-hammer feature in a way that is actually reversible.
Probably the button simply doesn’t exist to undo the termination 30/60/90 days later. Good luck getting them to admit it.
even though law enforcement cleared the two men
I've said it before and I'll say it again: Google now has enough power that it has effectively turned into a globalist government, a government you did not vote for.
What do you think is the ratio of innocent to nefarious pictures of naked children Google encounters in aggregate?
This is relevant to how outraged one should be by this story. I think it is probably > 1:100000. As such, probably not much outrage is warranted, although it’s obviously not great for this one guy.
Wow, I'd guess the opposite when we consider the base rate. Seems like a classic Bayesian problem.
It's fairly normal for parents to take pictures of their children naked in the bathtub/at the beach/ camping/etc.
Conversely, I'd expect actual pedophiles and CSAM producers to be really quite rare.
So even a relatively low base-rate of normal parents with normal nude photos would likely dwarf CSAM upon detection.
So, if we say 1/100 are pedophiles, and 30/100 are parents, and of the parents 10% have such photos, the ratio I'd expect without getting into detection rates is like 3:1 in favor of normal parents.
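The arithmetic above, written out (all the rates are the comment's own illustrative assumptions, not real data):

```python
# Base-rate sketch using the numbers from the comment above.
pedophiles = 0.01               # assume 1 in 100 users
parents = 0.30                  # assume 30 in 100 users are parents
parents_with_photos = parents * 0.10   # assume 10% of parents have such photos

# Ratio of innocent accounts to offender accounts, before detection rates:
ratio = parents_with_photos / pedophiles
print(round(ratio))  # about 3 innocent accounts per offender
```

Even with a generous 1% offender rate, the innocent photos dominate, and a realistic offender base rate (far below 1%) would push the ratio much further in favor of normal parents.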
9 replies →
If it wasn't clear, people aren't objecting to the use of automated tools to prevent crime. It's that there is absolutely no avenue of appeal or review against it even if the law enforcement exonerates them and a big news media reports it.
Google has inserted itself into almost all spheres of digital life by hook or crook. It's practically difficult to avoid them in many services - especially email. And now they play judge, jury and executioner. I don't understand any of these are acceptable, much less justifiable. The old argument 'think of the kids' used to justify digital authoritarianism is such a cliché by now.
1 reply →
I’d put pretty good money on almost every family having pictures of their naked kids doing some shenanigans. I know I have those pictures. My parents have those pictures of me and my sister. Quite a few, as I seemed to enjoy trying to run about naked…
Bill Watterson somehow managed to sneak watercolor paintings of a naked little boy into every major newspaper under the guise of being a “comic strip”— the perv.
1 reply →
Somehow I think there are more parents who sometimes need to take a photo of their naked children than there are paedophiles.
Or at least the ratio is clearly not 1:100000, maybe more like closer to 1:10.
You would need statistics on how many times Google has reported people to the police and how many times it has turned out to be a false alarm. Does Google even keep records of false alarms? Most likely they don't, to avoid responsibility.
1 reply →
Well, as a data point, I have pictures of my children naked. As another data point, my parents have photos of me as a child naked, and as a third data point my grandparents have photos of themselves as children naked. Whereas, to my knowledge, I don't know any paedophiles.
What are the ways to mitigate these issues for someone that wants a lot of the features of Google Photos? It seems that Amazon Photos is basically copying Google Photos and has a lot of the features. I wouldn't care if my Amazon account was closed down. And it is free if you are already a Prime member.
Do Flickr or Dropbox have the features?
What is not explained in the article is how the telehealth service came in contact with Google's image-scanning to begin with.
Is this not a huge breach of HIPAA?
Telehealth is sort of done for if randos at big tech can find their way into your sexual health records.
Telehealth had nothing to do with it other than causing the picture to be taken. The picture was taken with a Google-linked phone, the AI flagged it as CSAM. The transfer to the pediatrician was probably secure and not seen by Sauron's Eye.
> how the telehealth's services came in contact with Google's image-scanning
It didn't. If it had, it wouldn't have been the fathers' accounts being hit.
>Mark’s wife grabbed her husband’s phone and texted a few high-quality close-ups of their son’s groin area to her iPhone so she could upload them to the health care provider’s messaging system.
It sounds like it was probably the texting (maybe via the Google Messages app?) that got the images flagged, rather than the telehealth system.
> When Mark’s and Cassio’s photos were automatically uploaded from their phones to Google’s servers, this technology flagged them. Jon Callas of the E.F.F. called the scanning intrusive, saying a family photo album on someone’s personal device should be a “private sphere.” (A Google spokeswoman said the company scans only when an “affirmative action” is taken by a user; that includes when the user’s phone backs up photos to the company’s cloud.)
I assume it was triggered when the photos were backed up to Google Photos based on the above quote.
...and with CSAM scanning, Apple is going to get into the same business. With this latest security update I would not be surprised if CSAM scanning has already been deployed.
Because why on earth would you oppose protecting children? /s
This is why I don't have a Dropbox account anymore.
I am extremely fortunate that the account that was deleted without recourse only contained data I had copies of on my hard drive, and to my knowledge law enforcement isn't involved.
The article fails to mention the stress and trauma of being accused of having CSAM. That remains to this day ... I'm posting from an alt because even the false accusation carries a potentially career and family destroying stigma.
I wonder what google would do if they realized there’s millions of people who live as nudists.
GOOGLE EMPLOYEES: Quit your jobs and do something positive for the world for a change.
"don't be evil"
What if someone emails stock photos to Google execs that are known to trigger Google's child abuse algorithm? They would have to build a way to re-activate banned accounts to get their own accounts back.
I don’t think anyone believes the technical process does not exist.
Someone with enough clout would 100% be able to get accounts reinstated (likely without even needing a police report of no wrong-doing).
Google’s refusal to have appropriate moderation and support for edge-case situations is a completely different thing.
Then send it to tons of lawyers, politicians, judges, etc and a law might get passed forcing Google to reactivate accounts.
Google needs to be a regulated common carrier for many of their services. Then you have a right to service.
> Mark and his wife gave no thought to the tech giants that made this quick capture and exchange of digital data possible
Well... here we are, normal people don't think it's possible to transfer an image over the internet without a megacorp being in the middle of it. Pretty strong sign something has gone wrong.
Maybe they felt like their purchase of a $1000 phone entitled them to the right to use it as they please?
I'm talking about the author of the article not the victims.
Is this the case with Apple iCloud Photo Library, too?
In the article:
"Apple announced plans last year to scan the iCloud for known sexually abusive depictions of children, but the rollout was delayed indefinitely after resistance from privacy groups."
Wasn't that the whole point of Apple building a hash database of known child abuse content?
We have a direct primary care doctor for our children and would never send a photo with genitalia in it. Either she comes for a house call or we come to her. This article confirms my fear.
Sue Google.
Apple Messages encryption is looking pretty good right now. Let's hope they hold the line on CSAM.
Sounds like a great tool. Maybe it can be used for things … like preventing the next 9/11.
They all embed themselves deeply into our communications, for not-so-altruistic purposes--allegedly to "serve us better", realistically to train the shit out of their AIs in the hopes of growing (or at least maintaining) market share. If people weren't such cattle, a hard line would have already been drawn. If...
I just did
Nothing happened
welcome to the list
1 reply →
Just don't make knives. Period. The reasoning doesn't matter. The innocent tool that you use for cooking could be stolen, could be lost, could end up donated to a thrift store where anyone could buy or shoplift it, and the knife could end up in the hands of a serial killer.
Your intent doesn't make it right. And you have to make sure a criminal can never get their hands on one.
What's next, actual serial killers declare themselves chefs and thus can receive and share knives with other chefs "for cooking reasons"?
Please. Don't be so naive guys. Don't make knives. How the hell is that not common sense?
---
That is how ridiculous your suggestion about pictures sounds to other people.
lol.
> The context doesn't matter.
That is possibly the most idiotic comment I've read on this thread. Of course it matters.
Yeah, just delay seeking medical treatment (for your child) until this coronavirus thing blows over.
The context is critical. Context is always critical.
Also, I'm not sure why an "actual" paedophile couldn't be an acupuncture professional. I mean, acupuncture isn't really something you take your kids to before seeing the GP. To be clear: I don't think anyone should be able to just declare themselves a qualified practitioner and claim or imply that they can provide a service or results that they cannot. But I'm really unsure why someone calling themselves an acupuncture professional (whether they are qualified to or not) would be entitled to freely trade in sexual abuse material.
You clearly dont have children...
Richard was right.
It doesn’t seem right for the doctor to ask the parents to take pictures then send them over SMS, email, or whatever they asked them to use. Why wouldn’t this just be done within the privacy of the doctor’s office?
Telemedicine has been a thing since well before 2020.
You shouldn't have to travel to a doctor's office to get privacy. There is nothing wrong with a parent or a doctor taking pictures of a medical condition (rashes, etc).
There could be so many reasons. I will just name a few:
- Corona
- Specialist far away
- Quick Checkup
- Doctor is on vacation
It is beyond me that someone would use email to submit sensitive information. Pandemic aside, you should know better.
Also, I am sorry this happened. It is very human to respond to a person in authority - but we need to be better and start asking questions. It is our privacy at stake.
Hopefully everyone learns from this; also, Google was doing the right thing.
Where does it say that they used email to submit the information?
And I really don't see how Google insisting that banning them was the right thing to do and being cleared by police doesn't warrant undoing the ban is "doing the right things".
Trying to understand why you got so downvoted.
How can you tell? Maybe I wasn’t clear enough when trying to communicate my viewpoint?