Comment by roncesvalles
18 hours ago
Depends on your definition of "safe". Imagine an adult DMs a nude photo to a minor (or other kinds of predation).
If it's E2EE, no one except the sender and receiver know about this conversation. You want an MITM in this case to detect/block such things or at least keep record of what's going on for a subpoena.
I agree that every messaging platform in the world shouldn't be MITM'd, but every messaging platform doesn't need to be E2EE'd either.
The receiver has a proven, signed bundle that they can attach to an abuse report, so the evidence carries even stronger weight. They can already decrypt the message, and they can still report it.
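The signed-bundle idea above is similar in spirit to "message franking". A minimal sketch, assuming an HMAC commitment scheme (the function names and the per-message key handling here are illustrative, not any platform's actual protocol):

```python
import hmac
import hashlib
import os

# Sketch: the sender commits to the plaintext with a keyed MAC. The
# platform relays only the opaque commitment alongside the ciphertext.
# On report, the receiver reveals the plaintext and the MAC key, so the
# platform can verify the message is genuine without ever having been
# able to read it beforehand.

def send(plaintext: bytes):
    franking_key = os.urandom(32)  # fresh random key per message
    commitment = hmac.new(franking_key, plaintext, hashlib.sha256).digest()
    # The platform stores `commitment` bound to the conversation.
    return franking_key, commitment

def verify_report(plaintext: bytes, franking_key: bytes, commitment: bytes) -> bool:
    # Platform recomputes the MAC from the reported plaintext and key.
    expected = hmac.new(franking_key, plaintext, hashlib.sha256).digest()
    return hmac.compare_digest(expected, commitment)

key, tag = send(b"offending message")
assert verify_report(b"offending message", key, tag)      # genuine report verifies
assert not verify_report(b"forged message", key, tag)     # forgeries are rejected
```

The point is that verifiable reporting and end-to-end encryption are not mutually exclusive.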
Yes, but this leaves reporting by the minor as the only way to identify this behavior. I'm not saying I trust TikTok to do only good things with access to DMs, but I think it's a fair argument in this scenario that a platform has a better opportunity to protect minors if messages aren't encrypted.
I'm not saying no E2E messaging apps should exist, but maybe it doesn't need to for minors in social media apps. However, an alternative could be allowing the sharing of the encryption key with a parent so that there is the ability for someone to monitor messages.
> I think it's a fair argument in this scenario to say that a platform has a better opportunity to protect minors if messages aren't encrypted
Would it be a fair argument to say the police have a better opportunity to prevent crimes if they can enter your house without a warrant? People are wary of this sort of thing not because they think law enforcement is more effective when it is constrained; they know it isn't. But how easily crimes can be prosecuted is only one dimension of safety.
> However, an alternative could be allowing the sharing of the encryption key with a parent
Right, but this is worlds apart from "sharing the encryption key with a private company", is it not?
> I'm not saying no E2E messaging apps should exist, but maybe it doesn't need to for minors in social media apps. However, an alternative could be allowing the sharing of the encryption key with a parent so that there is the ability for someone to monitor messages.
The problem with that idea is that it implies E2E should require age verification. Everyone should have access to secure end-to-end encryption.
Are you suggesting all messaged photos should be scanned, and potentially viewed by humans, in case it depicts a nude minor? Because no matter how you do that, that would result in false positives, and either unfair auto-bans and erroneous reports to law enforcement (so no human views the images), or human employees viewing other adults' consensual nudes that were meant to be private. Or it would result in adult employees viewing nudes sent from one minor to another minor, which would also be a major breach of those minors' privacy.
There is a program whereby police generate hashes of known CSAM images; those hashes can then be automatically compared against the hashes of photos uploaded to websites, identifying known CSAM without any investigator having to view the material and further infringe on the victims' privacy. But that only works against already-known images, and it can be done automatically whenever an image is uploaded, prior to encryption. The encryption doesn't prevent it.
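The matching step described above can be sketched in a few lines. Note the real systems (e.g. PhotoDNA) use perceptual hashes that tolerate resizing and re-encoding; the plain SHA-256 used here only matches byte-identical files, and the hash list is made up for illustration:

```python
import hashlib

# Hypothetical list of hashes supplied by law enforcement. In practice
# these would be perceptual hashes, not cryptographic digests.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def is_known_image(image_bytes: bytes) -> bool:
    # Compare the upload's hash against the list. The image content
    # itself never needs to be viewed by a human.
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

assert is_known_image(b"known-bad-image-bytes")
assert not is_known_image(b"some-other-image")
```

Because the check runs client-side at upload time, before encryption, it is compatible with an E2EE transport.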
Point being, disallowing encryption sacrifices a lot, while potentially not even being that useful for catching child abusers in practice.
I'm sure some offenders could be caught this way, but it would also cause so many problems itself.
SimpleX handles this by sending the decryption keys when the receiver reports the message.
Keeping children safe and prosecuting offenders are two different concepts, only vaguely related. So no, being able to track predators doesn't make children safer. What keeps them safe is teaching them safe communication habits and keeping them away from things like TikTok.
We shouldn't make the world a worse place for everyone because some parents can't take care of their children.
>Keeping children safe and prosecuting offenders are two different concepts, only vaguely related.
See also: That time the FBI took over a CSAM site and kept it running so they could nab a bunch of users.
Not necessarily saying what they did was right, but I think there's a strong utilitarian argument to be made that what they did in that case was, in fact, the best way to keep children safe.
What's more dangerous? CSAM on the internet? Or actual child predators running loose?
What was the rate of child exploitation in the GDR?
Ugh. The kids aren't even safe from the people making and enforcing the laws. This argument should be long over for anyone with eyes or ears.
Imagine Hamas are your government and want to figure out who's gay. You don't want a MITM in case they can do this.
Pick your definition of safe.
In that case, don't use TikTok DMs to discuss your sexuality. I find it strange that people feel they have to be able to talk about sensitive topics over every interface they can get their hands on.
Similarly, in "traditional" media you might not want to have such a private conversation on a radio broadcast. You would rather discuss it on the phone or over snail mail, as there is more of an expectation of privacy in those media.
Right, but it currently isn't a sensitive topic - homosexuality is, as of 2026, broadly legal in the United States. That's a relatively new state of affairs, historically speaking, and one which Afghanistan shared as recently as 2021.
I'm commenting in the context of the conversation, not in a vacuum. You could just as (in fact, much more) easily say that children shouldn't be on apps with private messaging enabled. That would help a lot more, and then we could keep e2ee.
> there is more of an expectation of privacy in those media
What does the "p" in "pm" stand for?
This is fine if you have TLS encryption and the platform isn't local to that government's jurisdiction.
Sure, they can fabricate some evidence and get access to your messages, in which case, valid point.