Comment by laumars
4 years ago
It’s not just you. I have pictures of my kids playing in the bath. No genitals are in shot and it’s just kids innocently playing with bubbles. The photos aren’t even shared but they’d still get scanned by this tool.
This kind of thing isn’t even unusual either. I know my parents have pictures of myself and my siblings playing in the bath (obviously taken on film rather than digital photography) and I know friends have pictures of their kids too.
While the difference between innocent images and something explicit is easy for a human to identify, I’m not sure I’d trust AI to understand that nuance.
> No genitals are in shot
That you even have to consider sexual interpretations of your BABY'S GENITALS is an affront to me. I have pictures of my baby completely naked, because it is, and I stress this, A BABY. They play naked all the time, it's completely normal.
Yeah that’s a fair point. The only reason I was careful was just in case those photos got leaked and taken out of context. Which is a bloody depressing thing to consider when innocently taking pictures of your own family :(
I might have phrased that ambiguously, I mean "an affront to me" as in "to me, that's an affront", not that you have somehow insulted me.
I say let children be free, no court is going to indict you because you have baby pictures on your phone.
Don't immediately take affront; take the best possible interpretation of the parent comment. This is about automatic scanning of people's photo libraries in the context of searching for child pornography, presumably through some kind of ML. It seems to me that the commenter's concern is that, if there are photos of their child's genitals, they'll be questioned about creating child pornography, not that they're squeamish about photographing their child's genitals. This happened in 1995 in the UK: https://www.independent.co.uk/news/julia-somerville-defends-...
I mean I’m offended on behalf of the parent poster.
Indeed, I'm guessing this must be some cultural shift that was successfully implanted in some cultures because I too find the idea completely bonkers.
If babies playing naked are now child pornography, most Europeans above 50 should be jailed.
As that seems unlikely, I guess the CSAM detection just uses a constantly updated database of known hashes for matching.
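If that guess is right, the core of the matching would be little more than a set-membership check. Below is a purely illustrative Python sketch under that assumption; the `KNOWN_HASHES` set and file layout are invented, and real deployments use perceptual fingerprints rather than exact cryptographic hashes so that re-encoded copies of the same image still match.

```python
import hashlib
from pathlib import Path

# Hypothetical database of known-image fingerprints (SHA-256 hex digests).
# In practice the list comes from clearinghouses and is distributed in an
# opaque, updatable form rather than as plain hashes on the device.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_fingerprint(path: Path) -> str:
    """Return the SHA-256 digest of a file's bytes."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_library(photo_dir: Path) -> list[Path]:
    """Return photos whose exact bytes match an entry in the database."""
    return [p for p in photo_dir.glob("*.jpg") if file_fingerprint(p) in KNOWN_HASHES]
```

Note the obvious limitation: a single pixel change or re-save defeats an exact hash, which is why the real systems use perceptual hashing instead.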
> While the difference between innocent images and something explicit is easy for a human to identify, I’m not sure I’d trust AI to understand that nuance.
In this case it’s not AI that’s understanding the nuance; it’s the authorities who identify the exact pictures they want to track, and this tool then lets them identify which phones/accounts have that photo (or, presumably, took it). If ‘AI’ is used here, it is to detect whether one photo contains all or part of another photo, rather than to determine whether the photo is abusive or not.
Although there is a legitimate slippery slope argument to be had here.
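To make that distinction concrete, here is a minimal sketch of fingerprint matching using a toy “average hash”: the photo is reduced to a coarse perceptual fingerprint and compared against fingerprints of known images, so the code never judges whether content is abusive, only whether it resembles something already on the list. This is an assumption-laden illustration; production systems (e.g. PhotoDNA or Apple’s NeuralHash) are far more robust to cropping and re-encoding, and every name and threshold below is made up.

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size greyscale; set one bit per pixel brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def matches_known(photo: str, known_hashes: set[int], threshold: int = 5) -> bool:
    """True if the photo's fingerprint is within `threshold` bits of any database entry."""
    h = average_hash(photo)
    return any(hamming_distance(h, k) <= threshold for k in known_hashes)
```

The slippery-slope worry maps directly onto `known_hashes`: whoever controls that list controls what gets flagged, and nothing in the matching step itself distinguishes abusive images from any other content someone decides to add.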
Is there some way of verifying that the fingerprints in this database will never match sensitive documents on their way from a whistleblower to journalists, or anything else that isn't strictly illegal? How will this tech be repurposed over time once it's in place?
You seem to be suggesting that the AI will go directly from scanning your photos for incriminating fingerprints to reporting you to journalists.
I have to assume humans are involved at some point before journalists are notified. The false positive will be cleared up and no reputations will be sullied (except perhaps the reputation of using AI to scan for digital fingerprints).
> While the difference between innocent images and something explicit is easy for a human to identify, I’m not sure I’d trust AI to understand that nuance.
I recall a story from several years ago where someone was getting film developed at a local drugstore and an employee reported them for CP because of bath photos. This was definitely a thing before computers, with normal, everyday humans.