Comment by ulfw
4 years ago
I'm a little bit confused here and hope maybe some of you can clear this up.
My parents took lots of photos of me as a baby/small child. Say, me lying naked on a blanket, or a naked 2-year-old me in a kiddie pool in our backyard in the summer. Those are private photos, and because it was the 1970s they were just taken with a normal non-digital camera. They were OBVIOUSLY never shared with others, especially outside the immediate family.
Fast-forward to the 2020s, and today this type of picture would be taken with your iPhone. Would it now be classified as child pornography, even though it was never meant to be shared with anyone, nor ever was? Just your typical proud-parent photo of your toddler.
Sounds a bit like a slippery slope, but maybe I am misunderstanding the gravity here. I'm specifically distinguishing private "consumption" (a parent taking a picture of their child, who happens to be naked, as 1-year-olds tend to be sometimes) from "distribution" (a parent or even a nefarious actor taking a picture of a child and sharing it with third parties). I 100% want to eliminate child pornography. No discussion. But how do we prevent "false positives" with this?
As with all horribly ill-defined laws, it depends on how the judge is feeling that day and their interpretation of the accused's intent. If the case can be made that the images serve inappropriate gratification, they can be deemed illegal.
If that sounds absurd - most laws are like that. For better or worse, there's a human who interprets the law, not a computer. It's unfortunate that Apple is electing a computer as the judge here, for exactly the kind of concern you raise.
I believe there is a large database of known child pornography.
Unless someone has been distributing photos of your kids as child porn (which would probably be good to know about), it's unlikely any of your photos will match the hashes of the photos in that database.
I'm not sure that's how it works, but that's what I've gathered from the other comments on this post.
The idea is it detects specific images humans classified. Not any unclothed child.
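To make the distinction concrete, here is a toy sketch of hash-database matching. This is NOT Apple's actual system (which uses a proprietary perceptual hash called NeuralHash plus cryptographic protocols); it's a simple "average hash" stand-in, with images represented as small grayscale pixel grids, just to show why a novel family photo wouldn't match a database of specific known images:

```python
# Toy illustration of matching images against a database of known hashes.
# Hypothetical stand-in for a perceptual hash: one bit per pixel, set to 1
# if that pixel is brighter than the image's mean brightness.

def average_hash(pixels):
    """Return a bit string: 1 where a pixel exceeds the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Count differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

def matches_database(pixels, known_hashes, max_distance=2):
    """True if the image's hash is near any hash in the known database."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= max_distance for k in known_hashes)

# One specific known image, hashed into the database (4x4 grayscale grids).
known_image = [[200, 200, 10, 10],
               [200, 200, 10, 10],
               [10, 10, 200, 200],
               [10, 10, 200, 200]]
database = {average_hash(known_image)}

# A slightly re-encoded copy of the known image still matches...
copy_with_noise = [[198, 201, 12, 9],
                   [202, 199, 11, 10],
                   [9, 12, 201, 198],
                   [11, 10, 199, 202]]
print(matches_database(copy_with_noise, database))  # True

# ...but an unrelated photo, whatever it depicts, does not.
family_photo = [[120, 30, 120, 30],
                [30, 120, 30, 120],
                [120, 30, 120, 30],
                [30, 120, 30, 120]]
print(matches_database(family_photo, database))     # False
```

The point of the sketch: the hash encodes the specific image, not its subject matter, so only near-copies of already-classified images trigger a match. (Real perceptual hashes are more robust to cropping and recompression, and can have false positives; that's a separate debate.)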
So far. Many websites already use neural networks trained to detect any nudity. It is only a matter of time before that lands on all consumer computing devices. The noose will keep tightening because people keep debating instead of protesting.
Nudity detectors are decades old; people deploy them when they want to block any nudity. If that were the goal here, making iCloud servers scan for nudity would have been far simpler than building this new on-device system. And pornography is easier to access than ever for most people.