TikTok will not introduce end-to-end encryption, saying it makes users less safe

21 hours ago (bbc.com)

I think this is... fine? Am I just totally naive? I think it's fine to say "You don't really have privacy on this app" - as long as there are relatively good options for apps that do have privacy (and I think there are). TikTok is really a public-by-default type of social media; there's not much notion of mutual following or closed groups. So sure, you don't have privacy on TikTok; if you want it you can move to Snapchat or Signal or whatever platform of your choice.

Like, it's literally a platform that was run under the watchful eye of the CCP, and now the US version is some kleptocratic nightmare, so I just don't see the point in expecting some sort of principled stance out of them.

In some ways I think it's worse for places like Facebook to "care about privacy" and use E2EE but then massively under-resource policing of CSAM on their platform. If you're going to embrace 'privacy' I do think it's on you to also then put additional resources into tackling the downsides of that.

  • Tiktok has private messaging, and it is used by hundreds of millions of people.

    IMO no consumer service should have private 1:1 messaging without e2e. Either only do public messaging (i.e. like a forum), or implement e2e.

    • Tiktok has direct messages, they don't even call them private.

      It's better that they're honest about this, nobody should believe for a second that WhatsApp or FB messages are truly E2EE.

      DM on social media shouldn't be used for anything remotely private. It's a convenience feature, nothing more.

      31 replies →

    • In my experience most forums have private messaging.

      Additionally I think it is fine to say "we don't support e2ee". I prefer honesty to a bad (leaky) e2ee implementation, at least the user can make an informed choice.

      2 replies →

    • Adding that private self hosted forums can permit uploads of encrypted files, encrypted with a pre-shared secret or a secret shared over a private self hosted Mumble voice chat server.

  • > as long as there are relatively good options of apps that do have privacy (and I think there are)

    Once you have an enormous network effect like TikTok has, you don't really have any free selection of alternative apps. You are free to use one, but you will be the only sad user over there.

    Regulations are needed that would force large platforms like TikTok and Instagram to enable federation, opening them up to actual competition. This way platforms would be able to compete on monetisation and usability, instead of competing on locking in their precious users more strictly.

    • “Will we ever end the MySpace monopoly?”

      > MySpace is well on the way to becoming what economists call a "natural monopoly". Users have invested so much social capital in putting up data about themselves it is not worth their changing sites, especially since every new user that MySpace attracts adds to its value as a network of interacting people.

      > "In social networking, there is a huge advantage to have scale. You can find almost anyone on MySpace and the more time that has been invested in the site, the more locked in people are".

      https://www.theguardian.com/technology/2007/feb/08/business....

      1 reply →

    • >Regulations are needed

      Lolololol. No, not regulations. Regulators. With the people we currently have voted into office in the US the only regulations we are going to get are ones saying Sam and Peter must look at everything you do all the time.

      Until we stop voting for more authoritarianism, expect ever increasing amounts of authoritarianism.

      1 reply →

    • Federation would never work. How would it work here? Either you are forcing TikTok to give pageviews to federations of spam, or you are letting TikTok decide which federations to work with, which essentially results in no federation.

  • I am fine with TikTok remaining one of those 'we watch what you are doing' platforms. Those who do not care can have that if they wish; I do not mind.

    But bullshitting that it makes users more safe, that is ... bullshit! Worse than that, it distorts public opinion and intentionally fools the gullible.

  • It might be fine if they presented an honest choice.

    They are lying straight off though... police and safety teams don't read messages only "if they needed to" to keep people safe. They do so for a large variety of other reasons, such as suppressing political dissent and asserting domination and control.

    I don't think we can expect most people to understand TikTok's BS here either. I notice even a skeptic like you is uncritically echoing the dubious conflation of privacy and CSAM.

    • Anyone who doubts the requirement for e2e messaging should not be considered a skeptic, they are fully buying into whatever narrative LEO would like you to believe.

  • Fine with me too. I think many other apps (WhatsApp, FB, etc.) are using E2EE for PR purposes and are not actually good implementations of E2EE.

    Good implementations of E2EE:

    1. Generate the key pairs on device, and the private key is never seen by the server nor accessible via any server push triggered code.

    2. If an encrypted form of the private key is sent to the server for convenience, it needs to be encrypted with a password with enough bits of entropy to prevent people who have access to the server from being able to brute force decode it.

    3. Have an open-source implementation of the client app facilitating verifiability of (1) and (2)

    4. Permit the users to self-compile and use the open-source implementation

    If a company isn't willing to do this, I'd rather they not call it E2EE and dupe the public into thinking they're safe from bad actors.
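
    A minimal sketch of what points (1) and (2) can look like in practice, assuming Python and the "cryptography" package; the key type, KDF parameters, and placeholder passphrase are illustrative only, not any particular app's actual implementation:

    ```python
    import os, base64
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.scrypt import Scrypt

    # (1) Generate the keypair on the device; the private key object never leaves it.
    device_private_key = X25519PrivateKey.generate()
    device_public_key = device_private_key.public_key()    # safe to upload

    # (2) Optional encrypted backup: wrap the private key with a key derived from a
    # high-entropy passphrase via a memory-hard KDF, so someone with access to the
    # server cannot feasibly brute-force it.
    passphrase = b"correct horse battery staple example"   # placeholder only
    salt = os.urandom(16)
    kdf = Scrypt(salt=salt, length=32, n=2**15, r=8, p=1)
    wrapping_key = base64.urlsafe_b64encode(kdf.derive(passphrase))

    raw_private = device_private_key.private_bytes(
        encoding=serialization.Encoding.Raw,
        format=serialization.PrivateFormat.Raw,
        encryption_algorithm=serialization.NoEncryption(),
    )
    encrypted_backup = Fernet(wrapping_key).encrypt(raw_private)

    # Only device_public_key, encrypted_backup, and the salt ever go to the server.
    ```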

  • That it’s fine because it’s the CCP (commies see all) is a new one.

    It’s at best subpar for the same reasons as if it was the usual Silicon Valley spyware.

    I could leave well enough alone. But why? Because there are choices? There are five other brands of cereal that do not have 25% sugar? I’d rather be a negative nancy towards these on-purpose addictive, privacy-leaking attention pimp apps.

  • No, saying that e2e encryption makes users _less_ safe is completely dishonest, nothing is fine about this.

    The logic of "anything is better than before" is also fallacious.

    • Depends on your definition of "safe". Imagine an adult DMs a nude photo to a minor (or other kinds of predation).

      If it's E2EE, no one except the sender and receiver know about this conversation. You want an MITM in this case to detect/block such things or at least keep record of what's going on for a subpoena.

      I agree that every messaging platform in the world shouldn't be MITM'd, but every messaging platform doesn't need to be E2EE'd either.

      28 replies →

    • It makes certain users less safe in certain situations.

      E2E makes political activists and anti-Chinese dissidents safer, at the cost of making children less safe. Whether this is a worthwhile tradeoff is a political, not technical, decision, but if we claim that there are any absolutes here, we just make sure that we'll never be taken seriously by anybody who matters.

      4 replies →

    • Well, having no e2e encryption is safer than having a half-baked e2e encryption that has a backdoor and can be decrypted by the provider.

      And as for TikTok's stance, I think they just don't want to get involved with the Chinese government over encryption (and don't want to give users a false sense of privacy).

  • Trying to gaslight the public into thinking end to end encryption makes users less safe is not fine.

  • >I think it's fine to say "You don't really have privacy on this app"

    Disagree. To analogize why: privacy isn't heated seats, *it's seat belts*. Comfort features and preferences are fine to tailor to your customers and your business model. Jaguar targets a different market than Ford, and that's just fine.

    Safety features should be non-negotiable for all. Both Jaguar and Ford drivers merit the utmost protection against injury in crashes. Likewise, all applications that offer user messaging functionality should offer non-defective, non-harmful versions of it. To do that, e2e privacy is absolutely necessary.

    >I just don't see the point in expecting some sort of principled stance out of them.

    This is the defeatism that adds momentum to a downhill trajectory. Exactly the opposite approach arrests the slide: users expecting their applications and providers to behave in principled ways, and punishing those who do not, are what keeps principles alive. Failing to expect lawful and upright behavior from those you depend on, be they political leaders or software providers, guarantees that tomorrow's behavior will be less lawful and upright than yesterday's. Stop giving these people a pass for this horrible behavior, and start holding them unreasonably accountable for it; then we'll see behavior start to change in the direction that we mostly all agree it needs to.

    The most effective protests against internet censorship came from massive grass roots movements, with users drawing a line in the sand that they will not tolerate further impositions on their freedom.

    >In some ways I think it's worse for places like Facebook to "care about privacy" and use E2EE but then massively under-resource policing of CSAM on their platform.

    The irony is so manifest: billions of people having their privacy stripped by politicians and business elites in the name of protecting our children, while those politicians and business elites conspire en masse to prey on and sex-traffic our children. If these forces actually took those concerns seriously, rather than seeing them as an opportunity to push ulterior motives, they'd be eating each other alive right now. Half of DC, half of Hollywood, and at least a tenth of most major college administrations would ALL be in the dock.

    • Tesla doesn't have parking sensors. They're a safety feature. There are lots of safety features in cars that are optional; we've got an entire rating system for the safety of cars.

      We're talking about an app that's controlled by the CCP, I do expect them to take a principled stance - stances like Taiwan is a part of China and you can't be openly critical of the leader of the party. They don't have the same principles as you. You can force them to put in E2EE, but you can't force them to be honest about it or competent about it. I would rather know what we're getting than to push them to lie.

      This is the same thing as the OpenAI/Anthropic thing. You've got Anthropic taking a principled stance and getting pain for it, and you've got OpenAI claiming to take the same stance but somehow agreeing to the terms of the DoW. Do you think it's more likely that Anthropic carelessly caused themselves massive trouble, or that OpenAI is claiming to have won concessions that clearly won't work in practice? I think it's naive to think the former.

      1 reply →

Brilliant. They're repackaging the argument governments have long made about E2EE being dangerous to children.

    Children are just too effective a tool when building a surveillance state. We should have banned children from owning open computers a long time ago, just like we do with alcohol, driving licenses, etc.

    Instead, children would own special devices that are locked down and tagged with an "underage" flag when interacting with online services, while adults could continue as normal. We already heavily restrict the freedom of children, so there is plenty of precedent for this. Optionally we could provide service points to unlock devices when they turn 18, to avoid e-waste as well.

    This way it's the point of sale where you provide your ID, instead of attaching it to the hardware itself and sending it out to every single SaaS on the planet to do what they wish.

    • Would be a nightmare to implement and achieve the goal, but I have to say I think it’s more right than wrong. All of the data is very clear about the harms.

      China has restrictions for social media and screen time for kids — how do they implement this?

      25 replies →

    • The most important principle in the modern age is the freedom to prey on wallets. You can’t give parents tools to conveniently restrict what their children do. Impressionable minds ought to live in a lord of the flies state where they are bombarded with stuff to nag to their parents about and give them FOMO about what their friends have that they don’t have.

      That’s why children must be free.

    • At the same time, I remember growing up in the internet's wild west and bad encounters weren't an issue for me because of the golden rule I was taught from the start: you don't give your personal information and you don't interact with complete strangers. Learning to navigate the web instead of being in a walled garden was helpful in many ways.

      The better question to ask ourselves is: does the capability to gather more information also lead to more power to act on this information? If the investigative resources are spread thin already, it's not like they're gonna catch more criminals by investing more there. Repelling questionable individuals off the platform with lots of transparency -is- an effective way, but it's just a specific tool for a symptom.

      I think a part of a better solution is to give parents and children better tools to manage their social graph themselves. Essentially the real problem is discovery and warding off of social outliers, in a way that doesn't put all responsibility on opaque algos or corporations.

      A part of their e2e keys could be shared in an intentionally obtuse way, like mailing an item or a physical "friend code". That way parents and vetted friends can have their privacy. You don't need to tie an ID to someone's person to get positive confirmation of someone's poor behaviour. If someone crossed the line then parents can see it and escalate. In addition, what would happen to a child with abusive parents who can then arbitrarily restrict and deny a child's freedom to communicate? I did not have this myself, but without free access to other minds and information I would have been duller. Does a large information dragnet really serve our collective interests, or are more precise tools needed?

      2 replies →

    • Locking down children’s devices doesn’t stop adults sharing illegal content with other adults though, so there would still be pressure to monitor communications between adults.

      2 replies →

    • Indeed, way past time. Though no CEO would admit publicly what the addiction to attention/social media, gaming, and general screen use causes in children. Of course this should've been regulated similarly to alcohol, but billions would dry up, and it's much easier to witch-hunt marijuana and illegal raves, right?

    • > Instead children would own special devices that are locked down and tagged with a "underage" flag when interacting with online services, while adults could continue as normal.

      California is mandating OSes provide ages to app stores, and HN lost their mind because it's a ban on Linux.

      2 replies →

    • > while adults could continue as normal.

      After providing their identities to prove they are adults, and having all their activities tracked wherever they go and whatever they do.

      The first 18 years aren't freedom either, just the system prepping you for what's ahead.

  • I don't understand why all the child safety systems require age verification. Why not have a single setting on a smartphone that sends a 'child' flag to every single app or website, which then reacts accordingly? As long as you ensure that the browser can't be changed or modified, it should be fine.

    • The California law works this way, and it doesn't even have to require that the browser can't be modified.

    • Then adults could lie about their age and benefit from the data protection laws only granted to children for some reason.

  • Does it matter? It's just some arbitrary company. They do have the freedom to decide those things however they want, right? The customer can then decide whether to switch or not.

  • Ultimately your neighbors must buy the argument. The reason why this argument wins is not because framing is so tricky, but because it connects with the values of your neighbors. Trying to convince people that these aren't actually their values is swimming upriver.

  • The solution is simple: Take away the argument by blocking children's access to social media. Win-Win.

DMs are akin to private conversations in real life. Thus, every DM feature should entail E2EE.

It’s ok for a platform to not feature private conversations. They should just have no DM feature at all, then; make all messages publicly visible.

Private conversations are indeed not for all ages. Parents should be able to grant access to that on an individual basis.

  • > They should just have no DM feature at all, then; make all messages publicly visible.

    This makes no sense.

    I can discuss something in a bar which is not a very private conversation; I wouldn't care if someone else heard what I'm saying. But I also don't want someone to record it and post it on the internet to be seen by the whole world.

    Privacy is not just a boolean you toggle somewhere.

    • I suppose they mean that apps shouldn't brand their non-e2ee chat features as private or personal, since privacy is what users take as the default assumption when interacting in one-to-one chat.

    • In a bar you're not speaking directly into a microphone that is permanently saving everything you say for later instant access by every government and advertising agency that wants to prosecute you or invade your privacy to sell you something

  • Ah, but you see, soon TikTok will allow parents to spy on their children's DMs, and parents will love this.

    • Isn't that something we asked for? We keep asking for parents to parent their children instead of getting age verification laws, and that is what that looks like.

  • I fail to see the link between private conversations/DM and E2EE.

    To quote a comment I made some time ago:

    - You can call your service e2e encrypted even if every client has the same key bundled into the binary, and rotate it from time to time when it's reversed.

    - You can call your service e2e encrypted even if you have a server that stores and pushes client keys. That is how you could access your message history on multiple devices.

    - You can call your service e2e encrypted and just retrieve or push client keys at will whenever you get a government request.

    E2EE only prevents naive middlemen from reading your messages.

    • Fundamentally, actual E2EE is a complicated problem, and probably not very user friendly. It is full of technical trade-offs, and mistakes are very common. Or they lead to situations that people do not want: if you lost your phone or it breaks, how do you get your history back... What if you also forgot the password? Or it was stored in a local manager...

      It is a phrase that sounds good. But actually doing it effectively, in a way that the average user understands and can use with minimal effort, is very hard.

  • > DMs are akin to private conversations in real life

    There are parents out there who would record and AI-analyze every single private conversation their kids have if only the technology enabled it.

  • You could have a reasonable legal system where privacy is guaranteed. But you do not need end-to-end encryption for that to be a thing. It really is an orthogonal issue.

  • Sure, however kids these days often can't socialize irl - should kids be isolated from friends because they're unable to have any private conversations at all?

    During times in which I was unable to socialize irl (eg school holidays), and unable to talk to my friends online, I can confirm that the isolation was not good for my mental health.

This might be off-topic, but it's on-topic about child safety... I'm surprised people are being myopic about age verification. Age verification should be banned, but people ignore that nowadays most widely used online services already ask for your age and act accordingly: Twitter, YouTube, Google in general, any online marketplace. They already have so much data on their users and optimize their algorithms for those groups in an opaque way.

So yeah, age verification should be taken down, as well as the datamining these companies do and the opaque tuning of their algorithms. It baffles me: people are concerned about their children's DMs but are not concerned about what companies serve them and what they do with their data.

  • > people are concerned about their children's DMs but are not concerned about what companies serve them and what they do with their data.

    Hogwash.

    Where are these mythical people who aren’t concerned with both?

    • > Where are these mythical people who aren’t concerned with both?

      People don't care about "what companies serve them". They only care if the children see sexual content (or things considered deviant). Once sexual and deviant content is filtered, they're happy to give away their children's development to the company's algos.

      In effect, the people don't want to concern themselves with what their children consume, unless they're outraged by things normally taboo in their age group. Besides, if everyone is in it "it's not that wrong". They seek reactive entertainment rather than proactive engagement in their children's development.

    • > Where are these mythical people who aren’t concerned with both?

      They're called politicians.

  • Monitoring children's DMs is the responsibility of the parents, not megacorps. If a parent wants to install a keylogger or screen recorder on their child's PC, that's their decision. But Google should not be able to. Neither should... literally anyone else except maybe an employer on a work-provided device.

    • > Monitoring children's DMs is the responsibility of the parents, not megacorps

      Absolutely. But what responsibilities do megacorps have? Right now, everyone seems to avoid this question, and make do with megacorps not being responsible. This means: "we'll allow megacorps to be as they are and not take any responsibilities for the effects they cause to society". Instead of them taking responsibilities, we're collecting everyone's data and calling it a day by banning children from social networks... and this is because there are many interests involved (not related to child development and safety).

      12 replies →

    • The simplest way that can work is for the child account to be linked to a parent account, and the parent account can see the child account's DMs.

    • I also think children do/should have a right to privacy and their parents do not have to know everything.

      Kids should be able to write a journal or talk to friends with total trust that this information will not reach their parents.

    • > Monitoring children's DMs is the responsibility of the parents, not megacorps.

      Yup, but the tools provided make that easy or hard.

      But putting that emotive bit to one side, megacorps have a vested interest in not being responsible to children. They need children's eyeballs to drive advertising revenue. If that means sending them corrosive shit, then so be it.

      It's a bigger issue than encryption; it's editorial choice.

    • Mega corps should be compelled to and rewarded for allowing parents to monitor their children’s dms.

    • I'm all for helping parents to do this. Any site requiring age verification should indicate this in an HTTP header or whatever, and the browser I allow my child to use should respect that, and the parental controls should be easy for me to engage with.

      Many parental controls are massive pains to get working. Apple does fairly well (although I don't get a parental pin number to unlock the phone, which is normally fine as my child will tell me, but in some circumstances it wouldn't be), but does require the parent to be on the apple ecosystem too.

      EA and Microsoft however are terrible, especially as it's likely the child will be playing fortnite/minecraft and the parent won't have ever touched it. I think with minecraft we had to make something like 5 or 6 accounts across three different sites to allow online minecraft play from a nintendo switch.

    • Parents shouldn't give their child access to a device that allows DMs.

      That said, these platforms are making it impossible for parents to monitor anything. They're literally designed to profit off addiction in children.

      2 replies →

  • > Age verification should be banned

    Why?

    > They already got so much data on their users

    There are a variety of ways (see "Verifiable Credentials") that ages can be verified without handing over any data other than "Is old enough" to social media services.
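
    As a toy illustration of the general shape only (this is not the actual W3C Verifiable Credentials data model, and it omits holder binding and replay protection): an issuer signs nothing but the claim "over_18", so the relying service can check the signature without learning who the user is.

    ```python
    import json
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    issuer_key = Ed25519PrivateKey.generate()      # stand-in for an ID-checking issuer
    issuer_public_key = issuer_key.public_key()    # published, known to relying services

    # The credential carries only the fact needed: no name, no birthdate.
    claim = json.dumps({"over_18": True}).encode()
    signature = issuer_key.sign(claim)

    # The social media service verifies the issuer's signature and nothing else.
    try:
        issuer_public_key.verify(signature, claim)
        is_adult = json.loads(claim)["over_18"]
    except InvalidSignature:
        is_adult = False
    print(is_adult)
    ```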

    • Age verification obliterates anonymity on the internet. If everything you do _can_ be tracked by the government, it _will_ be.

      It allows for more effective propaganda and electoral control, and sets fire to the concept of a government _representing_ anyone.

      18 replies →

    • The problem with this discussion is that this is a wonk solution for wonkish times. You're trying to thread the needle between various reasonable compromises. Ironically due to social media, that is simply not how politics and lawmaking works any more. Instead it's an emotionally driven fight between various different sorts of moral panic, and the only option is to get people more mad about surveillance than "think of the children".

      You might be able to get somewhere by getting a tech company on your side, but they generally also hate adult content and don't mind banning it entirely.

      (people are not going to get age verification _banned_ any time soon! That's simply not going to happen!)

    • It's a slippery slope.

      This is the next two steps into 1984.

      Once you start mandating this, there's no going back.

      The next generation will start associating wrongthink with government IDs. (Wait, we already do that, right?)

      5 replies →

TikTok is a front for government surveillance, so it's not really surprising that this is their position.

"makes users less safe"

They don't believe that. It makes it more difficult to deal with governments, is all. Big Brother needs your messages from time to time, and TikTok is not willing to risk getting shut down to argue against that. We can't have pesky principles getting in the way of money.

I don’t really understand how we are supposed to believe in e2ee in closed proprietary apps. Even if some trusted auditor confirms they have plumbed in libsignal correctly, we have no way of knowing that their rendering code is free of content scanning hooks.

We know the technology exists. Apple had it all polished and ready to go for image scanning. I suppose the only thing in which we can place our faith is that it would be such an enormous scandal to be caught in the act that WhatsApp et al daren’t even try it.

(There is something to be said for e2ee: it protects you against an attack on Meta’s servers. Anyone who gets a shell will have nothing more than random data. Anyone who finds a hard drive in the data centre dumpster will have nothing more than a paperweight.)

  • The unfortunate fact about E2EE messaging is that it is hard to do. Even if you do have reproducible builds, the user is likely to make some critical mistake. What proportion of, say, Signal users actually compare any "safety numbers" for example? There is no reason to worry about software integrity if the system is already insecure due to poor usability.

    Sure, we should all be doing PGP on Tails with verified key fingerprints. But how many people can actually do that?

  • I've been making this argument for a long time, and it's never popular.

    People want to believe in E2EE, it's almost like religion at this point.

    Protecting people is synonymous with E2EE, even if you can't verify it and it can potentially be broken.

    I was even more controversial and singled out Signal as an example: https://blog.dijit.sh/i-don-t-trust-signal/

    • Same, my default MO is assuming 'e2ee' is broken and unsafe by default. Anything that I truly don't want sent over the wire would be in person, in the dark, in a root cellar, underwater. Not that I've ever been in the position to relay juicy info like that. Hyperbole I know, but my trust begins at zero.

  • With e2ee, please remember that it is important to define who the ends are.

    Perhaps your e2ee is only securing your data in transit, if their servers are considered the other end.

    Also one thing people seem to misunderstand is that for most applications the conversation itself is not very interesting, the metadata (who to who, when, how many messages etc.) is 100x more valuable.

Why would you use TikTok for private communications anyway? It's mostly a public short video sharing platform.

> the controversial privacy feature used by nearly all its rivals

"controversial" according to who? The NSA / GCHQ?

  • Listed in the article are the National Society for the Prevention of Cruelty to Children and the Internet Watch Foundation, which monitors and removes child sexual abuse material from the internet.

    The recent Meta lawsuits also mention opposition from the National Center for Missing and Exploited Children and Meta's own executives: Monika Bickert (head of content policy) and Antigone Davis (global head of safety). Both executives mention the danger end-to-end encryption poses to children when attached to a social media graph.

    https://www.reuters.com/legal/government/meta-executive-warn...

    • > Both executives mention the danger end-to-end encryption poses to children when attached to a social media graph

      So the fact that we welded a messaging platform onto a global-child-discovery-service is bad? Sure. Not encrypting that messaging platform is sort of closing the barn door after the horse has gone walkabout

      2 replies →

    • Good to see this called out. The HN echo chamber has this really terrible habit of attributing any disagreement with the prevailing opinion here to big, shadowy forces with evil motives (billionaires, corporations, three letter agencies, politicians, etc) instead of facing the reality that sometimes well meaning people just have different values and priorities than us. Very rarely does that narrative get challenged directly.

It doesn't matter. Web-based cryptography is always snake oil

https://web.archive.org/web/https://www.devever.net/~hl/webc...

  • > if the server operator was malicious, they could just push different client-side JavaScript

    Same as with OS updates, browser updates, dependencies used by the OS, dependencies used by the browser. Also you can run malicious software such as keyloggers and you're compromised.

    That argument doesn't mean E2E (even web based) is snake oil. Browsers just give you more points of failure.

    • The difference is: in web based cryptography, you get the cipher text and the code to decrypt it from the same source. Hijacking OS updates is arguably much harder than hijacking one particular web server, and there is pretty much no effective defense against malicious OS updates.

      1 reply →

  • Agree, but a significant point missed in the article is that of data vulnerability. With E2EE the company DB is useless to an external attacker.

    For some companies (e.g. Facebook, Google, TikTok) I would be mostly worried about the company itself being untrustworthy. For others I would be mostly worried about the company being vulnerable.

    • > with E2EE the company db is useless to an external attacker.

      Depends on who is defined as the other end, it may be that the company db is the other end.

  • It's a native app, what are you talking about?

    • > It is worth noting that this law also applies to non-web applications where the service provider supposedly being secured against is also the client software distributor; thus, the “end-to-end encryption” offered by Whatsapp and Signal, amongst other proprietary services, is equally bogus. (Both Whatsapp and Signal ban use of third party clients, and enforce this policy.)

    • The specific issue with web apps that is highlighted by the article is that you receive a fresh bundle of code every time you open or use the app, as opposed to, say, the operating system or desktop apps, which are updated less frequently. (Native) mobile apps are like web apps in that they release updates almost every day.

TikTok and other social media apps' business models are antithetical to privacy.

  • Their whole model is predicated on the lack of privacy, so it's crazy to expect anything else.

In my opinion, separate software should be used for the end-to-end encryption and for the communication itself, although there are other things to do for security besides only programming the computer correctly (such as securely agreeing on keys and ciphers in person).

Since when is E2EE controversial? Not using E2EE should be controversial.

  • It's never been controversial; it's the BBC doing its usual job of laundering the arguments the establishment wants you to hear for domestic consumption.

    • The thing is, it _is_ controversial. At least amongst the general public.

      Obviously not in somewhere like Hacker News where there’s a clear consensus, but if you asked a random sample of the UK population “should law enforcement be allowed to compel tech companies to hand over all DMs of confirmed paedophiles?”, I’d bet very good money the majority would say “yes”.

      The notion that “Big Tech” can absolve themselves of the responsibility to help law enforcement find child abusers by saying “it’s all encrypted, not my problem”, does not sit well with a large sector of the population.

      Whether it’s good or bad is an ultimately political question, and both sides of the debate tend to talk past each other on this topic, but it’s undeniably a controversial point within the broader population.

      3 replies →

lol

It makes sense - they extract every possible bit of personal information from your device - why would they make you believe they care about your privacy?

You want to communicate privately? TikTok is not the place, and that’s ok. shrugs

People seriously discuss privacy in a Chinese app. With all respect, their government will not allow you even a hint of privacy.

The core tension here isn’t really about encryption itself, it’s about moderation models.

Most large platforms rely heavily on server-side visibility for abuse detection, spam filtering, recommendation systems, and safety tooling. End-to-end encryption removes that visibility by design. Once a platform is built around centralized analysis of user content, adding strong E2EE later isn’t just a feature toggle — it conflicts with large parts of the existing architecture.

Do you feel safer knowing DMs are not encrypted?

A middle ground would be to implement E2EE but have messages signed (and ideally organized in a Merkle tree), so that if a DM is reported there's cryptographic proof that the accounts sent the messages.
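
A rough sketch of that idea, assuming Python and the "cryptography" package; the function names and the duplicate-last-leaf padding rule are hypothetical illustrations, not an existing platform's scheme:

```python
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold leaf hashes pairwise until a single root remains."""
    level = list(leaves)
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last node on odd levels
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

sender_key = Ed25519PrivateKey.generate()
messages = [b"first DM", b"second DM", b"third DM"]

# Each message is signed by the sender; a leaf commits to message + signature.
signed = [(m, sender_key.sign(m)) for m in messages]
leaves = [sha256(m + sig) for m, sig in signed]
root = merkle_root(leaves)                 # the platform could retain only this

# If one DM is later reported, revealing (message, signature) plus its Merkle
# path to `root` is cryptographic evidence that the account really sent it.
reported_msg, reported_sig = signed[1]
sender_key.public_key().verify(reported_sig, reported_msg)  # raises if forged
```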

I don't think the argument is really about child safety. If it were, TikTok would also be working on fixing their algorithm that can send minors toward harmful content, which is a far larger documented vector than encrypted DMs. This is about preserving access.

Fun fact - there is a big correlation between World Wars and compulsory education. Of course governments and big corporations "care" about children. Of course!

I feel like this makes sense for a platform that targets teens. Plus, I wouldn't trust TikTok to implement E2E encryption properly—who knows what they've snuck into their client.

  • What kind of application is not targeted at both teens and adults?

    Youtube, twitter, bluesky, whatsapp? Every app with a social aspect will be used by teens. And no, tiktok is not "only for teens" or "specially targeted at teens", nowadays everyone uses it and creates content on it.

    • Came here to post this.

      If you run (say) a restaurant, you get big spikes in business from TikTok videos in ways you don't get from Facebook or Instagram or others.

      TikTok is the platform everyone is on right now.

  • I think it's very safe to assume that no major US based platform has 'real' E2E encryption. They're almost certainly all a part of PRISM by now, and it'd contradict their obligations to enable government surveillance. So the only thing that's different is not lying about it. Though I expect the other platforms are, like when denying they were part of PRISM, telling half truths and just being intentionally misleading. 'We provide complete E2E encryption [using deterministically generated keys which can be recreated on demand].'

There is no way to do E2EE on a traditional social media platform with user-generated content and comply with existing US law.

You can’t moderate an E2EE platform.

  • All of Meta’s major properties (Messenger, Instagram, WhatsApp) support E2EE messaging.

    • Pretty sure that for Meta the impossibility of moderating E2EE was the point. It's cheaper to shrug than pay content moderators.

  • Aside from the fact that you can get metadata and that some communication frequently happens outside of E2EE - what US law do you believe mandates moderation? I'm curious.

That's good, people who need E2EE shouldn't use TikTok either way, there are plenty of other secure apps for that.

TikTok is a social media app, and it gets heavily abused as it is.

Making users less safe from… letting us snoop on all your communications for “national security”.

Unrelated, but I'm always surprised by the number of people who don't know that Instagram DMs are not encrypted by default.

I see it like this: taking in the totality of the danger, they're right. If the source (social network) and the destination (child brain) cannot be treated as trustworthy, then you must control the content for overall safety. If you could trust either end, then you could dismiss the argument. But you cannot trust children to be cognizant of abuse, and you already know social media literally reinvented abusive behaviors for the 21st century. Do nothing and children will be harmed. Overreach by any amount and you have destroyed freedom. The only middle ground is weakly encrypted E2E comms: something that creates a forcing function with a very high cost (an electric bill or SaaS service) for the sniffer but can be broken with enough horsepower. Think about what millions of dollars per character would do. Good luck codifying that insane compromise into a law.

I'll never let my kids have a TikTok account anyway (once they're adults they can have one of course if they want to).

> But critics have said E2EE makes it harder to stop harmful content spreading online, because it means tech firms and law enforcement have no way of viewing any material sent in direct messages.

Like they give a damn. I report accounts that explicitly sell fake credit cards, citing laws that make it illegal and 95% of the time "we checked and there is no violation here, we know that you're not happy but don't give a crap".

So the argument of security is utter bullshit and they just want to snoop.

I hate the BBC so much - "controversial privacy tech" "E2EE ... the best way to protect conversations from .. even repressive authorities" "End-to-end encryption has been criticised by governments, police forces"

They're saying this at the same time as they're clutching pearls over Iran's repression of protestors. Typical of the ethical consistency I would expect from them.

Reminder: Larry “citizens shouldn’t get any privacy” Ellison now owns TikTok. If you’re still using it, or have friends and family using it, you should stop immediately. It WILL eventually be used against you if this regime gets its way.

https://digitaldemocracynow.org/2025/03/22/the-troubling-imp...

The actual headline is currently

> TikTok won't protect DMs with controversial privacy tech, saying it would put users at risk

Not sure if this was changed since first posting. I don't mind updates, but unless it's a redaction for legal purposes (which should then itself be clearly mentioned), the BBC should provide a public changelog like Wikipedia.

A Chinese company saying you don't need encryption. Why should anyone waste time debunking their bad faith "arguments"?

TikTok’s stance against end-to-end encryption is unsurprising but still concerning. TikTok is a source of information on many topics, such as the genocide in Gaza, which traditional media underreport and many governments try to suppress. The network effect of big social media platforms means many people will likely talk about these topics in TikTok DMs. No matter what legal controls TikTok claims to enforce, there is no substitute for technological barriers for preventing invasions of privacy and government overreach. This is yet another example where corporations and governments sacrifice people’s autonomy and privacy in the name of security.

  • It's a pretty terrifying world we live in now, where an unencrypted addictive short-form video platform is considered a source of information more than news agencies or even community-managed forums.

    • For older generations Facebook has the same problem. "On Facebook it said [propaganda item bla bla]" is something I hear with those generations.

"The situation is made more complex because TikTok has long faced accusations that ties to the Chinese state may put users' data at risk."

And yet, it's even more complex than that, since it's now owned by cronies of the current US President. I've never had a TikTok account, but conceptually I was mostly pretty okay with being spied-upon by China. I'm never going to China.

  • > I'm never going to China.

    China will come to us.

    Or should that be:

    China will come to the US.

  • > "I'm never going to China."

    Voluntarily.

    • Yes. China gives a shit that user rdiddly, at 36 minutes before 00:55 UTC on March 4, 2026, said that China is spying to the point that they are going to be abducted for it.

It's one thing to make a policy decision I disagree with. It's another to lie, blatantly, to my face about it. But what do you expect from people who bought TikTok specifically so they could add censorship and lied about it being some kind of national security issue?

This, according to many researchers, is the best case-study example of corporations gaslighting users into accepting surveillance by companies and governments alike.

> Grooming and harassment risks are very real in DMs [direct messages] so TikTok now can credibly argue that it's prioritising 'proactive safety' over 'privacy absolutism' which is a pretty powerful soundbite

Means they read every message

why are we still wringing our hands around this? we’ve already determined that tiktok is bad for our health.

because tiktok is addicting, and they know it…

BBC calling encryption "controversial privacy tech" is deeply disappointing and dangerous.

  • I wondered how it could be considered 'controversial', but they do quote at least a couple groups speaking against it. The NSPCC for instance, who incidentally also warned parents about a Harry Potter video game because their children might want to learn more about the game:

    >“Parents should also be aware that players may want to find out more about the game using other platforms such as YouTube, Twitch, Reddit and Discord, where other game fans can discuss strategies and experiences.

  • It is controversial... amongst people who have concerns about private communications and society, from a regulatory and governance perspective.

    It's uncontroversial amongst people who value their privacy.

    The tension between the two camps (there are obviously nuances and this is a false dichotomy) is at a current peak. It's an ongoing controversy. It's a matter of public debate.

    You might have liked it better if the angle had been "...which the government, controversially, wants to clamp down on" or something.

  • Calling something controversial is a favorite propaganda technique employed by "news" outlets. It's another form of selective reporting and framing. It carries negative connotations, and has really no objective standard by which it can be wrong since you'll always find somebody against any issue.

    After you notice it, you'll notice it everywhere.

    • > It carries negative connotations

      Interesting. I'm not a native English speaker, but in news articles I have always interpreted "controversial" as meaning "under discussion" (perhaps even around a 50/50 divide), hence why they are writing an article about it.

      I feel it is the news outlet trying to justify why the topic is important to read about, since most people reading it will interpret the issue at hand as having a "common" stance. Usually it is used for topics that are very binary, for or against.

      2 replies →

  • The UK government seems a lot more willing to embrace the panopticon in the name of protecting people from terrorists, child sex traffickers, human rights activists, Catholics, jaywalkers, you name it.

> We know just how risky end-to-end-encrypted platforms can be for children

As opposed to doomscrolling and brainrot, which are not risky to expose children to at all. /s

If TikTok cared about children in the slightest, they would not exist.