Comment by ramesh31
4 years ago
What's actually scary here is that these were newly taken photos, not existing CSAM material flagged by hash value. That means Google is doing real-time image recognition on all of your photos. And that means Google has an ML model somewhere trained on millions of pictures of... yeah, this is fucked up.
From a link in the article:
While historical approaches to finding this content have relied exclusively on matching against hashes of known CSAM, the classifier keeps up with offenders by also targeting content that has not been previously confirmed as CSAM.[1]
[1] https://www.blog.google/around-the-globe/google-europe/using...
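For anyone unclear on the distinction the quote is drawing, here's a rough sketch of the two approaches (everything here is illustrative: the names, the threshold, and the model are made up, and real systems use perceptual hashes like PhotoDNA/CSAI Match rather than SHA-256):

    import hashlib

    # Hash-list approach: only catches exact copies of material
    # that has already been reviewed and added to the list.
    known_hashes = {"<digest of a previously confirmed image>"}

    def hash_match(image_bytes: bytes) -> bool:
        digest = hashlib.sha256(image_bytes).hexdigest()
        return digest in known_hashes

    # Classifier approach: scores *any* image, including ones never
    # seen before -- which is why it requires a trained model rather
    # than a lookup table.
    THRESHOLD = 0.9  # made-up cutoff

    class DummyClassifier:
        def predict(self, image_bytes: bytes) -> float:
            return 0.0  # placeholder; a real model returns a learned score

    def classifier_flag(image_bytes: bytes, model: DummyClassifier) -> bool:
        return model.predict(image_bytes) >= THRESHOLD

The first function can only ever re-identify known material; the second is the "keeps up with offenders" part of the quote, and it's the part that implies a model trained on that kind of data.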