Comment by aphroz

4 years ago

What about pictures of your own children naked?

Since this is using a db of known images, I doubt that would be an issue. I believe the idea here is that once police raid an illegal site, they collect all of the images into a db and then want a list of every person who had those images saved.

  • But it said they use a "perceptual hash" - so it's not just looking for 1:1, byte-for-byte copies of specific photos; it's doing some kind of fuzzy matching.

    This has me pretty worried - once someone has been tarred with this particular brush, it sticks.

    • You can’t do a byte-for-byte hash on images because a slight resize or minor edit will dramatically change the hash, without really modifying the image in a meaningful way.

      But image hashes like these are “perceptual” in the sense that small changes to the image produce only small changes to the hash, so near-duplicates map to nearby hashes. This is how reverse image searching works, and why it works so well; a toy sketch of the idea is below.

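      To make that concrete, here is a minimal sketch of one simple perceptual hash (a "difference hash", or dHash) in Python. It assumes Pillow is installed and the file names are placeholders; it only illustrates the general idea and is not the specific algorithm that PhotoDNA or NCMEC's matching actually uses.

      ```python
      # dHash sketch: shrink the image to a tiny grayscale thumbnail, keep only
      # the coarse brightness structure, and compare neighbouring pixels.
      # Resizing or re-encoding the photo barely moves this hash, whereas a
      # cryptographic hash (e.g. SHA-256) would change completely.
      from PIL import Image

      def dhash(path: str, hash_size: int = 8) -> int:
          # (hash_size + 1) x hash_size grayscale thumbnail
          img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
          pixels = list(img.getdata())
          bits = 0
          for row in range(hash_size):
              for col in range(hash_size):
                  left = pixels[row * (hash_size + 1) + col]
                  right = pixels[row * (hash_size + 1) + col + 1]
                  bits = (bits << 1) | (1 if left > right else 0)
          return bits  # 64-bit fingerprint when hash_size == 8

      def hamming(a: int, b: int) -> int:
          # Number of differing bits; a small distance means "perceptually similar".
          return bin(a ^ b).count("1")

      # Hypothetical usage: a resized or re-compressed copy of the same photo
      # typically lands within a few bits of the original; unrelated photos don't.
      # hamming(dhash("original.jpg"), dhash("resized_copy.jpg"))
      ```

      Matching against a database is then "is the Hamming distance below some threshold?", which is exactly where the fuzziness (and the false-positive risk discussed below) comes from.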

This isn't CSAM or illegal, nor would it ever end up in a database. Speaking generally, content has to be sexualized or have a sexual purpose to be illegal. Simple nudity does not count inherently.

  • > Simple nudity does not count inherently.

    That’s not entirely true. If a police officer finds you in possession of a quantity of nude images of children, especially of multiple different children, you’ll at least be brought in for questioning, if not arrested/tried/convicted, whether the images were sexualized or not.

    > nor would it ever end up in a database

    That’s a bold blanket statement coming from someone who correctly argued that NCMEC’s database has issues (I know your previous claim is true because I’ve seen false positives for images that are completely innocent, both legally and morally). That said, given the number of photos accidentally shared online (or hacked), saying that GP’s scenario can never end up in a database seems a bit off the mark. It’s very unlikely, as the sibling commenter said, but still possible.

    • > That’s not entirely true

      That's why I said it's not inherently illegal. Of course, if you have a folder called "porn" that is full of naked children, that modifies the context and therefore the classification. But if it's in a folder called "Beach Holiday 2019", it's neither illegal nor really a moral problem. I'm dramatically over-simplifying, of course. "It depends" all the way down.

      > That’s a bold blanket statement

      You're right, I shouldn't have been so broad. It's possible but unlikely, especially if it's not shared on social media.

      It reinforces my original point, however, because I can easily see a totally voluntary nudist family who posts to social media getting caught up in a damaging investigation because of this. If their pictures end up in the possession of unsavory people and get lumped into NCMEC’s database, then it's entirely possible they get flagged dozens or hundreds of times and get referred to the police. It's an edge case, but a family is still destroyed over it, and some wrongfully accused people have their names tarnished permanently.

      This kind of policy will lead to innocent people getting dragged through the mud. For that reason alone, this is a bad idea.


    • I believe both you and the other poster, but I still haven't seen anyone give an example of a false positive match they've observed. Was it an actual image of a person? Were they clothed? etc.

      It's very concerning if the fuzzy hash is too fuzzy, but I'm curious to know just how fuzzy it is.
