
Comment by meowface

4 years ago

Can you give more information about this? What kind of legal images might it match?

What about pictures of your own children naked?

  • Since this is using a db of known images, I doubt that would be an issue. I believe the idea here is that once police raid an illegal site, they collect all of the images into a db and then want a list of every person who had those images saved.

    • But it said they use a "perceptual hash" - so it's not just looking for 1:1, byte-for-byte copies of specific photos; it's doing some kind of fuzzy matching.

      This has me pretty worried - once someone has been tarred with this particular brush, it sticks.
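
      To make "fuzzy matching" concrete, here's a minimal sketch of an average hash (aHash), one of the simplest perceptual hashes. This is not what Apple's NeuralHash actually is (that's a neural-network embedding); it's just an illustration of why matches are approximate rather than exact. The function names and the threshold value are illustrative, and it assumes Pillow is installed.

      ```python
      # Minimal aHash sketch: shrink the image, threshold each pixel
      # against the mean brightness, and pack the result into a bitstring.
      # Re-saves, resizes, and minor edits barely change the bits.
      from PIL import Image  # assumes Pillow is installed

      def average_hash(path: str, size: int = 8) -> int:
          """Shrink to size x size grayscale; one bit per pixel,
          set when the pixel is brighter than the mean."""
          img = Image.open(path).convert("L").resize((size, size))
          pixels = list(img.getdata())
          mean = sum(pixels) / len(pixels)
          bits = 0
          for p in pixels:
              bits = (bits << 1) | int(p > mean)
          return bits

      def hamming(a: int, b: int) -> int:
          """Count of differing bits; a small distance means
          'probably the same underlying image'."""
          return bin(a ^ b).count("1")

      def matches_known_db(h: int, known_hashes: set[int],
                           threshold: int = 5) -> bool:
          """Fuzzy lookup against a db of hashes of known images
          (hypothetical helper, not any vendor's real API)."""
          return any(hamming(h, k) <= threshold for k in known_hashes)
      ```

      The threshold is the crux: set it tighter and you miss trivially edited copies; set it looser and unrelated images start colliding, which is where the false-positive worry in this thread comes from.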


  • This isn't CSAM or illegal, nor would it ever end up in a database. Speaking generally, content has to be sexualized or have a sexual purpose to be illegal. Simple nudity does not count inherently.

    • > Simple nudity does not count inherently.

      That’s not entirely true. If a police officer finds you in possession of a quantity of CP, especially of multiple different children, you’ll at least be brought in for questioning if not arrested/tried/convicted, whether the images were sexualized or not.

      > nor would it ever end up in a database

      That’s a bold blanket statement coming from someone who correctly argued that NCMEC’s database has issues (I know your previous claim is true because I’ve seen false positives for images that were completely innocent, both legally and morally). That said, with the number of photos accidentally shared online (or hacked), saying GP’s scenario could never end up in a database seems a bit off the mark. It’s very unlikely, as the sibling commenter said, but still possible.
