
Comment by odo1242

3 days ago

Among many, many issues: Apple used a neural network to compare images, which made the system very exploitable. You could send someone an image that had been invisibly altered to trip the filter, even though the image itself looked unchanged to a human.
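Roughly the class of attack being described: if the perceptual hash is computed by a differentiable network, you can gradient-optimize a tiny perturbation so an innocuous-looking image collides with a chosen hash. Below is a minimal sketch assuming PyTorch; the toy network, bit length, step budget, and epsilon are all invented for illustration and have nothing to do with NeuralHash's actual internals.

```python
# Hypothetical sketch of an adversarial hash collision against a
# differentiable perceptual-hash model. Everything here is a stand-in;
# it is not the real system's architecture or parameters.
import torch
import torch.nn as nn

class ToyPerceptualHash(nn.Module):
    """Tiny stand-in network mapping an image to an embedding;
    the sign of each dimension would form the hash bits."""
    def __init__(self, bits=96):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, bits),
        )

    def forward(self, x):
        return self.features(x)

def craft_collision(model, source_img, target_bits, steps=500, eps=4/255, lr=1e-2):
    """Find a small perturbation of source_img whose hash bits match
    target_bits while staying within an eps-ball (visually unchanged)."""
    delta = torch.zeros_like(source_img, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    target_sign = target_bits * 2.0 - 1.0  # {0,1} -> {-1,+1}
    for _ in range(steps):
        opt.zero_grad()
        emb = model((source_img + delta).clamp(0, 1))
        # Push each embedding dimension toward the sign of the target bit.
        loss = torch.relu(0.1 - emb * target_sign).mean()
        loss.backward()
        opt.step()
        # Keep the perturbation imperceptibly small.
        delta.data.clamp_(-eps, eps)
    return (source_img + delta).clamp(0, 1).detach()

if __name__ == "__main__":
    model = ToyPerceptualHash().eval()
    source = torch.rand(1, 3, 64, 64)        # the benign-looking image
    target = (torch.rand(96) > 0.5).float()  # hash bits to collide with
    adv = craft_collision(model, source, target)
    bits = (model(adv).squeeze(0) > 0).float()
    print("matching bits:", int((bits == target).sum().item()), "/ 96")
```

Researchers demonstrated collisions of exactly this flavor against the extracted NeuralHash model shortly after the announcement, which is why "send someone an image that trips the filter" was a credible threat.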

Also, once the system is created, it's easy to envision governments adding whatever images they want to detect to the on-device database, or loosening the filter's matching threshold so it starts sending many more images to the cloud. Especially since the filter ran on locally stored images, not things that were already in the cloud.

Their nudity filter in iMessage was fine, though (I don't think it ever sends anything over the internet? It just notifies your parents if you're a minor with Family Sharing enabled?)

> once the system is created, it's easy to envision governments adding whatever images they want to detect to the on-device database

A key point is that the system was designed to make sure the database was strongly cryptographically private against review. That's actually where 95% of the technical complexity in the proposal came from: making absolutely sure the public could never discover exactly what government organizations were or weren't scanning for.
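For a sense of what "private against review" means in practice, here is a minimal sketch, not Apple's actual protocol (which used a form of private set intersection); the names SERVER_SECRET, blind, and the stand-in entries are all invented for illustration. The idea: if database entries are run through a keyed one-way transform before being shipped to devices, someone holding only the shipped database cannot test what is in it.

```python
# Minimal sketch of why a keyed-blinded hash database resists outside
# review. NOT the real protocol; purely illustrative.
import hmac, hashlib, os

SERVER_SECRET = os.urandom(32)  # held only by the database operator

def blind(perceptual_hash: bytes) -> bytes:
    """Keyed one-way transform applied before an entry ships to devices."""
    return hmac.new(SERVER_SECRET, perceptual_hash, hashlib.sha256).digest()

# The operator builds the on-device database from source hashes it never
# publishes in the clear.
source_hashes = [os.urandom(12) for _ in range(3)]  # stand-in entries
on_device_db = {blind(h) for h in source_hashes}

# An auditor holding only on_device_db cannot check whether some suspect
# image's hash is in it: without SERVER_SECRET the entries are opaque.
suspect = source_hashes[0]
print(suspect in on_device_db)          # False: raw hashes never appear
print(blind(suspect) in on_device_db)   # True only if you hold the secret
```

The catch in this naive version is that the device can't match against the database without the secret either; bridging that gap without ever revealing the database contents is roughly where the private-set-intersection machinery, and most of the complexity the comment describes, came in.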