Comment by _jal

4 years ago

Can you point to any examples of these leaks?

(Edit: Thanks for the links.)

https://www.theverge.com/22611236/epic-v-apple-emails-projec...

#71, from the Epic v. Apple antitrust trial discovery.

  • That's referring to iMessage. Predators grooming children is not something the on-device photo library scanning is going to fix.

    They could let users report such messages to Apple, but that requires having a team of humans to review iMessage reports.

  • The CSAM perceptual hash scanning is not the only new thing; there's also an on-device machine learning algorithm used by the Messages app attempting to identify "sexually explicit photos" as well:

      "The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos. When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.

      Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages." -- https://www.apple.com/child-safety/

      While that's billed as "available for Family accounts in iCloud", so _probably_ opt-in, it's another piece of software I never asked for and don't want on the phone I bought.
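      To make the distinction between the two systems concrete: perceptual hashing (the CSAM-matching side) is not exact hashing. A minimal sketch of the general idea, using a simple "average hash" over a tiny grayscale grid — this is an illustration of the technique class only, not Apple's NeuralHash, whose details are proprietary:

      ```python
      # Illustrative "average hash" perceptual hash (NOT Apple's NeuralHash).
      # Downscale an image to a small grid, then set one bit per pixel based
      # on whether it is brighter than the grid's mean. Near-identical images
      # (resized, re-encoded) produce hashes with a small Hamming distance.

      def average_hash(pixels):
          """pixels: 2D list of grayscale values 0-255, e.g. an 8x8 downscale."""
          flat = [p for row in pixels for p in row]
          mean = sum(flat) / len(flat)
          return [1 if p > mean else 0 for p in flat]

      def hamming_distance(h1, h2):
          """Number of differing bits between two hashes."""
          return sum(a != b for a, b in zip(h1, h2))

      image = [[10, 200], [220, 30]]       # toy 2x2 "image"
      variant = [[12, 198], [215, 35]]     # same image after mild re-encoding
      distance = hamming_distance(average_hash(image), average_hash(variant))
      print(distance)                      # 0 -> treated as a match
      ```

      The Messages feature is the opposite approach: a classifier scoring image content directly, rather than matching against a known-image hash database.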


    • They could pay $10,000,000 total per year to 100 people and it'd be a rounding error against their nearly $2,000,000,000,000 market cap.
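      A quick back-of-the-envelope check of that arithmetic (the $100k salary split is implied by the comment's figures, not a verified number):

      ```python
      # 100 reviewers at $100k/year = the comment's $10M total, compared
      # against a ~$2 trillion market cap.
      reviewers = 100
      cost_per_reviewer = 100_000            # USD per year (implied by comment)
      annual_cost = reviewers * cost_per_reviewer
      market_cap = 2_000_000_000_000         # ~$2 trillion

      print(annual_cost)                     # 10000000
      print(annual_cost / market_cap)        # 5e-06, i.e. 0.0005% of market cap
      ```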