Comment by hedora

4 years ago

They cancelled the CSAM scanning project. In contrast, Google has been doing CSAM scanning (and, apparently, AI porn detection) for years.

Also, the bone-headed Apple plan had safeguards that would have prevented the victim in the article from losing their account.

The two policies aren't remotely comparable.

You’re right, they’re not comparable: Apple was going to scan for CSAM on your device, before the photos ever reached their servers (via an uploaded backup, a texted copy, etc.). Google, so far as I can tell and as corroborated by the article, doesn’t do that. Apple and Google both already scan anything uploaded to your Photos account for CSAM and report matches to authorities.

Yeah, Apple’s plan was annoying (not even that, really; it’s the idea rather than the actual effect that bothers me), but ultimately fairly benign compared to what Google does.