Comment by hogwasher

8 hours ago

Are you suggesting all messaged photos should be scanned, and potentially viewed by humans, in case they depict a nude minor? No matter how you do that, it would produce false positives, and then either unfair auto-bans and erroneous reports to law enforcement (if no human reviews the images), or human employees viewing other adults' consensual nudes that were meant to be private. It could also mean adult employees viewing nudes sent from one minor to another, which would itself be a major breach of those minors' privacy.

There is a program whereby police generate hashes of known CSAM images, and those hashes can then be automatically compared against the hashes of photos uploaded to websites, so that known CSAM can be identified without any investigator having to view the images and further infringe on the victims' privacy. But that only works against already-known images, and it can run automatically whenever an image is uploaded, before encryption takes place; the encryption doesn't prevent it.
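
To make that concrete, here's a minimal sketch of the idea, not any particular vendor's system. Real deployments use a perceptual hash (e.g. PhotoDNA) so near-duplicates still match; the SHA-256 and the KNOWN_HASHES set below are stand-ins purely for illustration.

```python
import hashlib

# Hypothetical database of hashes of already-known images,
# supplied by law enforcement (hex strings, illustrative only).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(image_bytes: bytes) -> str:
    """Hash the raw image bytes; a real system would use a perceptual hash."""
    return hashlib.sha256(image_bytes).hexdigest()

def check_upload(image_bytes: bytes) -> bool:
    """Run on the plaintext at upload time, before the message is encrypted.

    Returns True if the image matches a known hash and should be flagged;
    no human ever has to view the image itself.
    """
    return image_hash(image_bytes) in KNOWN_HASHES
```

The point the sketch makes is that the comparison happens on the plaintext at upload time, so encrypting the message afterwards doesn't get in its way; it just can't catch images that aren't already in the database.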

Point being, disallowing encryption sacrifices a lot, while potentially not even being that useful for catching child abusers in practice.

I'm sure some offenders could be caught this way, but it would also cause many problems of its own.

> Are you suggesting all messaged photos should be scanned, and potentially viewed by humans, in case it depicts a nude minor?

No, I was not suggesting that.