Comment by EagnaIonat

1 month ago

Citation needed. As the latest news suggests the opposite.

> Citation needed.

The best one I remember is this one: https://www.hackerfactor.com/blog/index.php?/archives/929-On...

> As the latest news suggests the opposite.

What news?

  • They don't work on the CSAM. They are a cloud provider who makes CSAM reports.

    The only thing I see that is wrong in what they claimed is this:

    > The problem is that you don't know which pictures will be sent to Apple.

    Apple said in their paper that they don't send anything that is flagged. When the cloud sync occurs, the files that are flagged get reviewed; all other files remain encrypted.

    > What news?

    https://www.nytimes.com/2024/12/08/technology/apple-child-se...

    • > They don't work on the CSAM. They are a cloud provider who makes CSAM reports.

      --- start quote ---

      If you sort NCMEC's list of reporting providers by the number of submissions in 2020, then I come in at #40 out of 168. For 2019, I'm #31 out of 148

      I repeatedly begged NCMEC for a hash set so I could try to automate detection. Eventually (about a year later) they provided me with about 20,000 MD5 hashes that match known CP. In addition, I had about 3 million SHA1 and MD5 hashes from other law enforcement sources.

      --- end quote ---

      Somehow this isn't "working with CSAM or CSAM databases", oh well.
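      The workflow the quote describes — hashing files and checking the digests against known MD5/SHA-1 lists — can be sketched roughly like this. This is a minimal illustration, not anyone's actual pipeline; the hash sets here are hypothetical placeholders standing in for the lists NCMEC or law enforcement would supply.

      ```python
      import hashlib
      from pathlib import Path

      # Hypothetical known-hash sets. In practice these would be loaded
      # from the MD5/SHA-1 lists provided by NCMEC or law enforcement.
      KNOWN_MD5 = {"5d41402abc4b2a76b9719d911017c592"}  # md5(b"hello"), for demo only
      KNOWN_SHA1: set[str] = set()

      def flag_known_files(root: str) -> list[Path]:
          """Return files under `root` whose MD5 or SHA-1 digest
          appears in one of the known-hash sets."""
          flagged = []
          for path in Path(root).rglob("*"):
              if not path.is_file():
                  continue
              data = path.read_bytes()
              if (hashlib.md5(data).hexdigest() in KNOWN_MD5
                      or hashlib.sha1(data).hexdigest() in KNOWN_SHA1):
                  flagged.append(path)
          return flagged
      ```

      Exact-hash matching like this only catches byte-identical copies, which is why perceptual hashes (PhotoDNA, NeuralHash) exist at all — but it shows what "automate detection" against a hash set means mechanically.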

      > Apple said in their paper that they don't send anything that is flagged.

      This is literally addressed in the article you pretend you read.

      > https://www.nytimes.com/2024/12/08/technology/apple-child-se...

      I don't have access to this article. How does it show that people working with CSAM say Apple's implementation is a good idea?