Comment by 331c8c71

1 year ago

I know nothing about cryptography, but the following does not sound too bad as a default, tbh. And I think it is misleading to focus solely on e2ee and not mention the distributed aspect.

https://telegram.org/faq#q-do-you-process-data-requests

> To protect the data that is not covered by end-to-end encryption, Telegram uses a distributed infrastructure. Cloud chat data is stored in multiple data centers around the globe that are controlled by different legal entities spread across different jurisdictions. The relevant decryption keys are split into parts and are never kept in the same place as the data they protect. As a result, several court orders from different jurisdictions are required to force us to give up any data.

> Thanks to this structure, we can ensure that no single government or block of like-minded countries can intrude on people's privacy and freedom of expression.

> Telegram can be forced to give up data only if an issue is grave and universal enough to pass the scrutiny of several different legal systems around the world.

> To this day, we have disclosed 0 bytes of user data to third parties, including governments.

You can coherently argue that encryption doesn't matter, but you can't reasonably argue that Telegram is a serious encrypted messaging app (it's not an encrypted messaging app at all for group chats), which is the point of the article. The general attitude among practitioners in the field is: if you have to reason about how the operator will handle legal threats, you shouldn't bother reasoning about the messenger at all.

  • [flagged]

    • > you can install a reproducible build of Telegram and be sure it's end-to-end encrypting things.

      This is incorrect. The construction for group chats in Telegram is not e2e at all. The construction for DMs is considered dubious by many cryptographers.

      It does not matter if you can reproduce a non-e2e encrypted message scheme, you must still trust the servers which you have no visibility on.

      Trustworthy e2ee is table stakes for that reason. Reproducible builds aren't, because we can evaluate a bunch of different builds collected in the wild and detect differences in implementation, which is the same thing we'd do if reproducible builds were in effect.

      There are lots of reasons splitting jurisdictions makes sense but you wrote a whole bunch of words that fall back to “hope Telegram doesn’t change their protections in the face of governmental violence”.

      5 replies →

    • > On the question of balancing privacy and security, there are in fact solutions, but you have to get away from the idea of a centralized police force / centralized government, and think in terms of a free market of agencies, that can decrypt limited evidence only with a warrant and only if they provide a good reason. The warrant could be due to an AI at the edge flagging stuff, but the due process must be followed and be transparent to all

      What does this mean? How can "we" move away from centralized states to "a free market of agencies"? How can there be a "market" of police forces, even in principle? Who are the customers in this imagined market? Who enforces the laws to keep it a free market?

      At first glance, this sounds like libertarian fan fiction, to be honest, but I am curious.

      3 replies →

  • > if you have to reason about how the operator will handle legal threats, you shouldn't bother reasoning about the messenger at all.

    That's true.

    You need to run your own platform, people. XMPP is plenty simple, plenty powerful, and plenty safe -- and even your metadata is in your control.

    Just self host. There's no excuse in 2024.

    Wake up people!

    Why should the arrest of someone else affect YOU?

    • "You need to run your own platform people." What problem does this solve?

      I'm someone who's been on the business end of a subpoena for a platform I ran, and narcing on my friends under threat of being held in contempt is perhaps the worst feeling I'm doomed to live with.

      "XMPP is ..." not the solution I'd recommend, even with something like OMEMO. Is it on by default? Can you force it to be turned on? The answer to both of those is, as it turns out, "no," which makes it less than useful. (This is notwithstanding several other issues OMEMO has.)

      6 replies →

    • As if it were that simple. Where are you going to host that self-hosted instance? What protections against law enforcement inspections do you have? What protections against curious/nefarious hackers? How are you going to convince every single person you interact with to use it?

      Gung-ho evangelism rarely converts people the way a reasonable take on the subject does.

    • > Just self host. There's no excuse in 2024.

      I hate to break it to you, but there's plenty of excuses. We live in a bubble on HN.

      May I remind you what the average person is like with this recently famous reddit post:

      https://archive.is/hM2Sf

      If you want self-hosting to happen, with things like Matrix and so on, the hard truth is that it can't merely be easy for someone who can program; it has to be trivial for someone who says "wow, can you hack into <x>" when they see you use a terminal.

    • You're assuming end-to-end encryption doesn't exist, and that the only way to be safe is to have someone close to you self-hosting.

      Self-hosting is terrible in that it gives Mike, the group's creepy tech guy nobody suspects, 100% control over the metadata of the people close to him: who talks to whom, when, etc. It's much better either to get rid of that with a Tor-only p2p architecture (you'll lose offline messaging), or to outsource hosting to some organization that has no interest in your metadata.

      The privacy concern Green raised was confidentiality of messages. Telegram offers none, and because of that Telegram should have moderated its content for illegal material. They made a decision to become a social media platform like Facebook, but they also chose not to cooperate with the law. Durov was asked to stop digging his hole deeper back in 2013, and now he's reaping what he sowed.

Yes: End-to-end encryption is technically quite difficult, but politically and legally feasible (at least currently, at least in most countries).

Simply not cooperating with law enforcement is technically moderately difficult, but politically and legally impossible.

Between a difficult and an impossible option, the rational decision is to pick the difficult one.

  • Indeed. Even being charitable and assuming that they're not lying (they say elsewhere that they've shared zero bytes with law enforcement, despite this being demonstrably false), in reality, if an EU country (France, perhaps) were to arrest the founder, all they need to do is threaten him with twenty years in prison, and I'm sure he'll gladly give up the keys from all the different locations they supposedly have.

  • Is there a nice solution for multiparty (n >= 3) end-to-end encryption?

    • A possible implementation using existing infrastructure, where at least the client is open: modify the messaging client so that when it holds multiple private connections, it routes every incoming message to all connected members. If you have, say, 10 users who want encrypted group chats, have one of them run the modded client, so that any user opening a private chat with that client effectively enters a room with the other users. Of course this requires trust between members, and adding another encryption layer on all clients might turn out to be necessary so that you don't have to worry about whether the carrier is telling the truth (all p2p connections encrypted, etc.).

    • Have the room owner create an AES-256 key, send it to all party members via 1:1 e2ee, and encrypt room messages with that AES key.

      4 replies →
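
The room-owner key scheme suggested in this subthread can be sketched roughly as follows. This is a hedged sketch, not Telegram's or anyone's actual protocol: it assumes the third-party Python `cryptography` package, and the 1:1 e2ee channel used to hand out the key is abstracted away entirely.

```python
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Room owner generates a 256-bit group key once and sends it to each
# member over an existing 1:1 e2ee channel (distribution not shown).
group_key = AESGCM.generate_key(bit_length=256)

def encrypt_room_message(key: bytes, plaintext: bytes) -> bytes:
    """Any member encrypts to the room with the shared group key."""
    nonce = os.urandom(12)  # 96-bit nonce, must be unique per message
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_room_message(key: bytes, blob: bytes) -> bytes:
    """Split off the nonce and authenticate-then-decrypt the rest."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)
```

The known trade-offs of this simple version: no forward secrecy, and removing a member means the owner has to generate and redistribute a fresh key, which is roughly why real group protocols (Signal's sender keys, MLS) add key rotation on top of this idea.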

I wonder if this is practically relevant at all.

Given that users can access their messages without any interaction with people at Telegram, automatic aggregation of the cloud data for individual endpoints must already be in place.

Consequently, the data can be accessed from a single jurisdiction anyway.

  • Wouldn’t being forced to give up the password and logging in be a violation of the 5th amendment, at least in the US? I think it’s a mixed bag of rulings right now, but it seems like it would make sense for it to fall that way at the end of the day.

      • Even if you have a password set in Telegram as a second factor, Telegram can bypass it anyway, and the user isn't even asked.

The problem with this approach is that it relies on governments accepting your legal arguments. You can say "no, these are separate legal entities and each one requires a court order from a different country" all you want, but you also need to get the courts themselves to agree to that fact.

The problem with this claim is that it's hardly verifiable. Telegram's backend is closed source, and the only thing you can be sure of is that their backend sees every message in plaintext.

  • [flagged]

    • Crypto is really hard. You have to trust that whoever implemented the crypto is smart and diligent, and you have to trust that whoever operates the crypto is smart and diligent, and you have to trust both of those parties.

      Centralization means that it's very easy to trust that whoever implements and operates the crypto is smart. Do I trust them? I don't know. I trust myself, but I don't think I am independently capable of operating or implementing crypto - if I want to make assertions like "this is end-to-end-encrypted" and ensure those assertions remain true, I will need a several million dollar a year budget, at a minimum. "Decentralized" means you've got tons of endpoints that need securing, and they can share crypto implementations, but the operations are duplicated. Which means it's more expensive, and you're trusting more operators, especially if you want resiliency.

      Yes, something like Signal or Whatsapp means you've got a single point of failure, but something like Matrix, you've got many points of failure and depending on how it's configured every point of failure can allow a different party to break the confidentiality of the system.

      Decentralization is great for resiliency but it actively works against reliable and confidential message delivery.

      1 reply →

    • What do web3 and cryptocurrencies have to do with the discussion?

      Decentralized protocols have existed for a very long time. Email has existed since the '70s. The telephone network is also arguably decentralized and has existed for even longer.

      9 replies →

    • > Many people on HN silently downvote anything that has to do with crypto and decentralization.

      I primarily downvote them because I haven't seen anything come out of that space that seems like it's remotely capable of actually achieving decentralization (for which I also see a dire need in today's structure of the Internet and the applications running on it).

      95% of the time, these things are built as a Potemkin village of technical decentralization backed up by complete administrative centralization, with the path to actual decentralization "very high on our public roadmap available here, we promise!!!"

Maybe hijack the key and the message before they get distributed. Or, if it's the Chinese or Russian authorities asking, just go after the pieces themselves. Or just threaten to close the local data center unless they collect the pieces from elsewhere, and see if they can be convinced to hand over what they have, regardless of where they put it.

We can be null at cryptography, but handing over both the secret and the key to that secret to the very same party requires quite a lot of trust, even when they say 'I promise I will not peek or let others peek, pinky promise!' with an 'except if we have to, or if we change our mind' in the fine print or between the lines.

https://www.spiegel.de/netzwelt/apps/telegram-gibt-nutzerdat...

> Translated: Contrary to what has been publicly stated so far, the operators of the messenger app Telegram have released user data to the Federal Criminal Police Office (BKA) in several cases.

https://torrentfreak.com/telegram-discloses-user-details-of-...

> Telegram has complied with an order from the High Court in Delhi by sharing user details of copyright-infringing users with rightsholders.

Anyway, those are just some examples in which their structure doesn't matter. In the end, user data is still given away. It's also why e2ee should be the sole focus; everything else is "trust me bro, it's safe" levels of security.

>To protect the data that is not covered by end-to-end encryption, Telegram uses a distributed infrastructure. Cloud chat data is stored in multiple data centers around the globe that are controlled by different legal entities spread across different jurisdictions.

This is utter bullshit that I debunked back in 2021.

https://security.stackexchange.com/questions/238562/how-does...

  • In practice it didn't work either: only one government was needed to arrest the guy. And now all they need is a hammer or some pliers; no need for multiple governments to coordinate.

    • Well, I'm sure France isn't taking Durov to some black site at this point. But since there's no such thing as distributed computation of a single AES block operation, each server must by definition have access to its SQL database's key, and that key can be confiscated from whichever node is interacting with the database. Last I heard, the EU servers were in the Netherlands, so if needed, perhaps the authorities there will handle it after court proceedings.

> The relevant decryption keys are split into parts and are never kept in the same place as the data they protect. As a result, several court orders from different jurisdictions are required to force us to give up any data.

Or the CEO and owner, staring down the barrel of a very long time in prison, obtains the keys from his employees and provides them to the authorities.

Would he do this? To me, it matters little how much I trust someone and believe in their mental fortitude. I could instead rely on mathematical proofs to keep secrets, which have proven to be far better at it than corporations.

Splitting stuff between multiple companies doesn't really protect anyone if the boss of all companies is held hostage.
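
For intuition, the key-splitting claim can be illustrated with a toy n-of-n XOR scheme (purely an illustration; Telegram has not published how their keys are actually split). It also makes the weakness concrete: each share alone reveals nothing, but anyone who can compel every holder, such as a single boss, reconstructs the key trivially.

```python
import os

def split_key(key: bytes, n: int) -> list[bytes]:
    """Split `key` into n shares; all n are required to reconstruct."""
    # n-1 shares are pure randomness...
    shares = [os.urandom(len(key)) for _ in range(n - 1)]
    # ...and the last share is the key XORed with all of them.
    last = key
    for share in shares:
        last = bytes(a ^ b for a, b in zip(last, share))
    shares.append(last)
    return shares

def combine_shares(shares: list[bytes]) -> bytes:
    """XOR all shares together to recover the original key."""
    key = shares[0]
    for share in shares[1:]:
        key = bytes(a ^ b for a, b in zip(key, share))
    return key
```

Each share on its own is statistically independent of the key, but that guarantee is about missing shares; it says nothing about a hierarchy in which one person can demand all of them.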

Also

> To this day, we have disclosed 0 bytes of user data to third parties, including governments.

Didn't they conclude an agreement with the Russian government in 2021?

Clearly the investigating authorities are not buying that argument because, well, it's completely absurd. Both technically and legally, Telegram are in control of those keys, regardless of where they are hosted.

> Telegram can be forced to give up data

That's all you need to know. Matrix and Signal can't be forced in any way.

That's Telegram's CEO saying how he and his employees were "persuaded and pressured" by US FBI agents to integrate open-source libraries into Telegram (1). There are a lot of questions to ask, such as whether those open-source libraries are indeed compromised, among other things. I take it that this arrest was the final straw to pressure him to give up and hand over some "needed" data, as all the accusations I've read are laughable. Instagram is full of human trafficking, exploitation of minors, drug dealers, and worse. The same goes for other social media, and I don't see Elon or Zuck getting arrested. I am confident that this arrest is to obtain specific information, and that after that he will be released, or will spend 20 years in prison if he doesn't comply.

(1) https://youtu.be/1Ut6RouSs0w?t=1082

  • Or he's trained in the art of lying

    "At St. Petersburg State University, Mr. Durov studied linguistics. In lieu of military service, he trained in propaganda, studying Sun Tzu, Genghis Khan and Napoleon, and he learned to make posters aimed at influencing foreign soldiers."

    https://www.nytimes.com/2014/12/03/technology/once-celebrate...

    You really think the FBI would casually go to Durov and start telling him which libraries to deploy in his software?

    This "they're trying to influence me, that means it's working" 5D-chess is the most stupid way to assess the security of anything.

    There's nothing to backdoor because it's already backdoored:

    Code does not lie about what it does. And the Telegram clients' code doesn't hide that it doesn't end-to-end encrypt the data it sends to Telegram's servers. That's the backdoor. It's there, right in front of you, with a big flashing neon light that says "backdoor". It's so obvious I can't even write a paper about it, because no journal or conference would accept me stating the fucking obvious.

I do wonder if this would hold up, though. If Telegram stored each character of your chat in a different country, would a single country not be able to force them to hand over the data, and either fine them or force them to stop operating if they wouldn't share the full chat? It seems like a loophole, but I don't know what the precedent is.