And it only works because a corporation would likely want to offer this to its users as a convenient feature. If they were actively trying to hide this, they could rig the test and keep access to themselves.
> Does Telegram let them see it: I don't think so.
This is exceptionally naive. Even if he was arrested for not sharing with the French, what about other countries? Was he arrested for never sharing, or for not sharing enough? Even if he, personally, has never shared, that says nothing about his employees, who have the same access to these systems.
Your data is not private with Telegram. You are trusting Telegram. It is a trust-based app, not a cryptographically secure app.
If you trust Telegram, that’s your choice, but just because a person says the right words in interviews doesn’t mean your data is safe.
Telegram is the only messaging app I know of that has brought attention to the fact that your messages go through Google/Apple notification APIs, which seems like it would utterly defeat any privacy advantage offered by E2EE.
> Does Telegram let them see it: I don't think so.

That seems to be the core issue with Durov being arrested.
The UAE requires decryption keys as part of their Telco regulations.
If Telegram can operate in the UAE without VPN (and it can), then at the very least the UAE MoI has access.
They (and their shadow firms like G42 and G42's shadow firms) were always a major buyer for offensive capabilities at GITEX.
On that note, NEVER bring your personal phone to DEFCON/Blackhat or GITEX.
Edit: cannot reply below so answering here
Cybersecurity conferences.
DEFCON/Blackhat happen during the same week, so you have a lot of script kiddies who lack common sense trying to pwn random workloads. They almost always get caught (and charged - happens every year), but it's a headache.
GITEX is MENA and Asia's largest cybersecurity conference. You have intelligence agencies from most of the Middle East, Africa, Europe, and Asia attending, plus a lot of corporate espionage because of politically connected MSSPs as well as massive defense tenders.
AFAIK this current case has absolutely nothing to do with any form of chat feature; it’s about Telegram’s public channels, which more or less work like Reddit/Twitter/any other news channel, except it refuses to censor content.
All the encryption stuff is largely a red herring. It’s not technical access to the information that is the issue; the primary issue is that people can share and exchange information that the various regimes do not want shared. They want censorship, i.e., control of thought and speech, arresting the flow of information.
They know what is being said, and what they want to arrest is the fact that information can be sent and received at all. And by “they” I mean more than just the French. That was just coincidental and pragmatic.
The French state does not operate that quickly on its own, to get an arrest warrant five minutes after he landed and execute on it immediately. That has other fingerprints all over it in my view.
> They probably should implement E2EE for everything
Certainly not, because then Telegram would lose a lot of the functionality that makes it great. One thing that I really enjoy about Telegram is that I can have it open and synced across many independent devices. Telegram also has E2EE as an option on some clients, which can't be synced.
They probably share it with Russian authorities. Just look now: Russia is allowing protests in his favour (they only allow protests they support), and they arrested a French citizen on fake drug charges right after.
Do you have some info about Durov being arrested for not letting law enforcement see encrypted messages? The public info says he was arrested for "...lack of moderation, ...[and] failing to take steps to curb criminal uses of Telegram."
I don't see anywhere saying he's been arrested for anything to do with encryption or cooperating with investigations.
Well of course, but this is a feature of Telegram. It's the only messaging app where messages are stored in the cloud. This of course has security implications, but it also allows you to have a large number of chats without wasting your device's memory like WhatsApp does, or having to delete old conversations, and lets you access your chats from any device. By the way, you can also set a password to log in from another device (two-factor authentication; WhatsApp now has this option too).
To me it's a good tradeoff, of course I wouldn't use Telegram for anything illegal or suspect.
> It's the only messaging app where messages are stored on the cloud.
Besides Slack and Discord and Teams and whatever the heck Google has these days and iMessage and...
I think you mean it's the only messaging app that purports to have a focus on security where messages are stored in the cloud, which is true, but also sus. There's a reason why none of the others are doing it that way, and Telegram isn't really claiming to have solved a technical hurdle that the E2E apps didn't, it's just claiming that you can trust them more than you can trust the major messaging apps.
Maybe you can and maybe you can't, the point is that you can't know that they're actually a safer choice than any of the other cloud providers.
But that's literally the entire point of this article. That is, in this day and age, when people talk about "secure messaging apps" they are usually implying end-to-end encryption, which Telegram most certainly is not for the vast majority of usages.
This is such a misrepresentation. Telegram could, at will, feed the cloud-2FA password to a password-hashing function like Argon2 to derive a client-side encryption key. Everything could be backed up to the cloud in an encrypted state that only you can access. Do they do that? No.

So it's not so much a trade-off as it is half-assed security design.
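A minimal sketch of what that client-side derivation could look like. scrypt from Python's stdlib stands in for Argon2 here, and the function names and parameters are illustrative, not Telegram's actual design:

```python
import hashlib
import os

def derive_client_key(password: str, salt: bytes) -> bytes:
    # scrypt as a stdlib stand-in for Argon2: memory-hard, so brute-forcing
    # the 2FA password against an exfiltrated blob stays expensive
    return hashlib.scrypt(
        password.encode("utf-8"),
        salt=salt,
        n=2**14, r=8, p=1,   # memory-hardness parameters
        dklen=32,            # 256-bit client-side key; the server never sees it
    )

salt = os.urandom(16)  # stored next to the ciphertext; it need not be secret
key = derive_client_key("correct horse battery staple", salt)

# Deterministic for the same password+salt, so any device can re-derive it
assert key == derive_client_key("correct horse battery staple", salt)
# A different password yields an unrelated key
assert key != derive_client_key("wrong password", salt)
```

The server would store only the salt and ciphertext; the key exists only on clients that know the password.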
That's it. The article could be just that. You log back in and all your messages are there without you having to provide a secret or allow access to some specific backup? Your data just lives on the server. The only thing preventing anyone from accessing it is the goodwill of the people running the server.
Not true. Secret chats only live on a device where you started it. Regular people may not use them (their problem), but these are common for business-critical chats in my circles.
Indeed and this is the other thing - even if Telegram don't themselves co-operate with law enforcement, it'd be fairly easy for law enforcement to request access to the phone number from the carrier, then use it to sign into the Telegram account in question and access all of the messages.
You can set a password that’s required to authenticate a new device.
Once that’s set, after the SMS code, then (assuming you don’t have access to an existing logged in device because then you are already in…), you can either reset the password via an email confirmation _or_ you can create a new account under that phone number (with no existing history, contacts, etc).
If you set a password and no recovery email, there is no way for them to get access to your contacts or chat history barring getting them from Telegram themselves.
I upload encrypted backups to a cloud service provider (AWS, Google Cloud). I go to another computer, download them, use a key/password to decrypt them.
Sure, I get it, you're typing in something that decrypts the data into their app. That's true of all apps including WhatsApp, etc... The only way this could really be secure is if you used a different app to the encryption that you wrote/audited such that the messaging app never has access to your password/private key. Otherwise, at some point, you're trusting their app to do what they claim.
The previous poster intentionally mentioned the password-recovery flow. If you can gain access without your password, then law enforcement can too. If you could only gain access with your password, you could consider your data safe.
Offhand, this sounds like a terribly insecure workflow but...
Client creates a public/private key pair used for E2EE.

Client uses the account password (raw) as input to the derivation of a symmetric encryption key, and uses that to encrypt the SECRET key and store it on the service's cloud.

NewClient signs in, downloads the encrypted SECRET key blob, and decrypts it using the symmetric key reconstructed from the sign-in password. Old messages can then be decrypted.

-- The part that's insecure -- If the password ever changes, the SAME SECRET then needs to be stored to the cloud again, encrypted under the new key. Some padding with random data might help, but this still sounds like a huge security loophole.

-- Worse insecurity -- A customer's device could be shipped a compromised client which uploads the SECRET keys to requesting third parties upon sign-in. Those third parties could be large corporations or governments.
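A toy model of the flow above, including the re-wrap step flagged as insecure. The XOR keystream is for illustration only (a real design would use an AEAD like AES-GCM), and all names here are hypothetical:

```python
import hashlib
import os

def kdf(password: str, salt: bytes) -> bytes:
    # Account password -> symmetric wrapping key (scrypt; Argon2 in practice)
    return hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1, dklen=32)

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy keystream cipher for illustration ONLY -- not real cryptography
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Client generates the long-term SECRET key used for E2EE
secret_key = os.urandom(32)

# Wrap it under the account password; blob + salt go to the service's cloud
salt = os.urandom(16)
blob = xor_stream(kdf("account password", salt), secret_key)

# NewClient signs in, downloads the blob, re-derives the wrapping key, unwraps
recovered = xor_stream(kdf("account password", salt), blob)
assert recovered == secret_key

# The insecure part: a password change re-wraps the SAME secret under a new key
new_salt = os.urandom(16)
new_blob = xor_stream(kdf("new password", new_salt), secret_key)
assert xor_stream(kdf("new password", new_salt), new_blob) == secret_key
```

Note that nothing in this flow stops a compromised client from simply uploading `secret_key` after unwrapping it, which is the "worse insecurity" above.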
I do not see how anyone expects to use a mobile device for any serious security domain. At best average consumers can have a reasonable hope that it's safe from crooks who care about the average citizen.
I know this is getting off-topic, but all the discussion about encryption is missing an important weakness of any crypto system: the human factor.
I found it interesting that countries like Singapore haven’t introduced requirements for backdoors. They are notorious for passing laws for whatever they want as the current government has a super majority and court that tends to side with the government.
Add on top Telegram is used widely in illegal drug transactions in Singapore.
What’s the reason? They just attack the human factor.
They just get invites to Telegram groups, or they bust someone and force them to handover access to their Telegram account. Set up surveillance for the delivery and boom crypto drug ring is taken down. They’ve done it again and again.
One could imagine this same technique could be used for any Telegram group or conversation.
You already know how Signal is going to come out here, because this is something people complain incessantly about (the inconvenience of not getting transcripts when enrolling new devices).
Also the same with Skype "encryption". The data is "encrypted", but you receive the private key from the server upon sign-on... so one just needs to change that password temporarily.
- Locally create a recovery key and use it to wrap any other essential keys.
- Split that, or wrap it with two or more keys.
- N − 1 of those go to the cloud to be used as MFA tokens on recovery.
- For the other, derive keys from normalized responses to recovery questions, use Shamir's secret sharing to pick a number of required correct responses, and encrypt the Nth key.
You can recover an account without knowing your original password or having your original device.
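For the Shamir step, a minimal 2-of-3 split/recover over a prime field might look like this. This is illustrative only; a production scheme would use a vetted library and constant-time arithmetic:

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a ~128-bit secret

def split(secret: int, k: int, n: int):
    """Split `secret` into n shares, any k of which reconstruct it."""
    # Random polynomial of degree k-1 with the secret as constant term
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def combine(shares):
    """Lagrange interpolation at x=0 recovers the constant term (the secret)."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # pow(den, PRIME-2, PRIME) is the modular inverse (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

secret = 0xDEADBEEFCAFEBABE
shares = split(secret, k=2, n=3)
assert combine(shares[:2]) == secret   # any 2 of the 3 shares suffice
assert combine(shares[1:]) == secret
```

With k set to, say, 3-of-5 recovery questions, a user who remembers most answers can recover, while the server alone (holding fewer than k shares) cannot.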
Unless you can prove (e.g. using your old device or a recovered signing key) that the new device is yours. In that case, if the service supports it, the new device could automatically ask your contacts to re-send the old messages using the new device's public key.
Telegram has secure calls and secure e2e private chats.
All other chats are backed up to the cloud.
So if you intend to use private communication, the answer is "no"; if you don't care, the answer is "yes".
Why not the "founder locked up" test? If the founder claims secure encryption, yet they are not in jail, that means there's no secure encryption because they negotiated their freedom in exchange for secret backdoors.
Maybe, but it’s not a good litmus test. If it’s truly secure and the founder can’t provide information because they don’t have access to it, it’s also possible that authorities can’t build a case in most countries.
That isn’t applicable here. Telegram isn’t encrypted and yet they refused to comply with subpoenas. Companies whose customer data is encrypted can truthfully say that they have no way to access it for law enforcement. Telegram can’t.
Maybe in the future, creators of encrypted messaging apps will get locked up. I certainly hope not. But this case doesn’t indicate anything one way or another.
Yeah, and the only way to get governments to learn why E2EE is important is to show them that if law enforcement can get it, then so can hackers and phishers. We need as many politicians' dark secrets hacked and outed as possible. Performing such hacks should be a whistleblower-protected right codified into law.
In my opinion, Telegram is more of a social network than a messenger. There are many useful channels and in many countries, it plays an important role in sharing information. If we look at it from this point of view, e2ee does not seem very important.
We should also not forget that, at a time when all social media (Reddit, X, Instagram, etc.) are closing their APIs, Telegram is one of the only networks that still has a free API.
That's the dangerous part. It's a messaging app that took on the function of a social media platform. It did so without robust security features like end-to-end encryption, yet it advertised itself as heavily encrypted. As Green stated in his blog post, users expect that to mean only the recipient can read what you say, i.e., end-to-end encryption.
Telegram would be fine if it advertised itself as a public square of the internet, like Twitter does. Instead, it lures people into a false sense of security for DMs and small group chats, which is what Green's post and thus this thread is ultimately about.
Free API doesn't mean anything until they fix what's broken, i.e. provide meaningful security for cases where there's reasonable expectation of it.
> a social media platform. It did so without robust security features like end-to-end encryption
Most social media platforms don't support E2EE.

Some chat apps do support E2EE but also require a god damn phone number to log in (yeah, so does Telegram), which makes the "encryption" useless, because authorities just ask the telco to hand over the login SMS code.
The free API is amazing. I have so many little helper bots that help me automate my life. It's easier and more feature-rich than Twilio or Slack. I made my own stock-management bot that eats a screener spreadsheet I upload in the chat and tells me if I should sell my stocks.

There is even that freqtrade bot that runs on Telegram, and RSS bots too. It really is amazing. So easy to use for chat ops.
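For anyone curious, a bot call is just an HTTPS request to the Bot API. A sketch of building a `sendMessage` request with only the stdlib (the token and chat id below are placeholders, not real credentials):

```python
import json
import urllib.request

API_BASE = "https://api.telegram.org"

def build_send_message(token: str, chat_id: int, text: str) -> urllib.request.Request:
    # The Bot API exposes methods as URLs of the form /bot<token>/<method>
    url = f"{API_BASE}/bot{token}/sendMessage"
    payload = json.dumps({"chat_id": chat_id, "text": text}).encode()
    return urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )

# Placeholder credentials; a real bot token comes from @BotFather
req = build_send_message("123456:PLACEHOLDER", 42, "AAPL dropped below your stop price")
assert req.get_full_url().endswith("/sendMessage")
# urllib.request.urlopen(req) would actually send it
```

That's the whole surface area a helper bot needs, which is why chat-ops glue like this stays so small.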
Most "normal" people use messaging app and social medias DM interchangeably.
For instance 2 days ago my partner wanted to show me a message her friend sent, went to whatsapp and couldn't find it then realized said friend had used instagram DM for that. Most people don't care enough.
> It's a messaging app that took in the function of a social media platform. It did so without robust security features like end-to-end encryption yet it advertised itself as heavily encrypted.
Do you want to say that social networks must implement E2E? Personally I think it is a good idea, but existing social networks and dating apps do not implement it so Telegram is not obliged to do it as well.
As for promises of security, everybody misleads users. Take Apple. They advertise that cloud backups are encrypted, but what they don't like to mention is that by default they store the encryption keys in the same cloud, and even if the user opts into "advanced" encryption, the contact list and calendar are still not E2E encrypted, under a silly excuse (see the table at [1]). If you care about privacy and security you probably should never use iCloud in the first place, because it is not fully E2E encrypted. Also note that Apple doesn't even mention E2E in the user interface and instead uses misleading terms like "standard encryption".
This is not fair. Apple doesn't do E2E cloud backups by default and nobody cares, phone companies do not encrypt anything, Cloudflare has disabled Encrypted Client Hello [2], but every time someone mentions Telegram, they are blamed for not having E2E chats by default. It looks like the bar is set differently for Telegram than for other companies.
It’s not encrypted by default, and even if it were encrypted, you should never trust any connected device with anything important. That being said, Telegram is hands down the best communication platform right now. It is feature-rich, with features implemented years ago that are only now being added to other platforms. It has normal chatting/video calls, groups, channels, and unlimited storage in theory, all for free. I just hope it doesn’t go downhill after what happened these last days because there’s no proper replacement that fulfills all Telegram features at once.
Signal has probably the worst UX of any messaging app. It also used to require sharing phone numbers to add contacts, which imo is already a privacy violation.
Telegram is fast, responsive, gets frequent updates, has great group chat, tons of animated emojis, works flawlessly on all desktop and mobile platforms, has great support for media, bots, and a great API, allows edits and deleting messages for all users, and I really like the sync despite it not being e2e.
The worst UX you can provide. Clumsy, slow switching between views, search worse than WhatsApp's, stickers like it's 2005, no formatting, no bot API (of course there are a few "hacked" implementations, but is that really the way?), a UI bloated with margins and padding.

No smooth animations either; that's what makes Telegram stand out from everything else here, but maybe not everyone is happy when 6-core phones can deliver something more than 60fps in 2024...

That's what I remember, and yes, mostly those are probably easy-to-fix UI/UX features/bugs, but even though it's open-source, they aren't fixed.
Telegram is great for large groups. It's better to compare Telegram to Reddit than Signal.
Signal is excellent for tiny groups of known participants. I prefer it over anything else for this use case. The group permissions Signal introduced a few years ago are well suited for that purpose. I've recently started running small groups on Signal with about 100 participants who mostly know each other, but not tightly. The recent addition of phone number privacy makes this feasible.
Once you start moving up in scale you really need moderation tools, and Signal doesn't do so well there. When you have thousands of people and it's open to the public you need to moderate or else bad actors will cause your valuable contributors to leave. Basic permissions like having admins who can kick people out and restricting how new members can join only gets you so far.
The issue is that in Signal there is no group as far as the server is concerned: The state of the group exists only on client devices and is updated in a totally asynchronous manner. As a consequence it is more difficult for Signal to provide such features. For example, Signal currently has no means to temporarily mute users, to remove posts from all group members, easy bots to deal with spam, granting specific users special privileges like ability to pin messages, transferable group ownership as opposed to a flat "admin" privilege, etc.
Think about the consequences of Signal's async nature with no server state: What does it mean to kick someone out? An admin sends a group update message that tells other clients to stop including that user in future messages. Try this: Have a group member just delete Signal and then re-register. Send a message to the group. They're still in the group. You get an identity has changed message. These are really only actionable with people who you know... that is, in tiny groups.
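A toy model of that client-side-only group state, showing why a "kick" is merely advisory. This is an illustration of the failure mode described above, not Signal's actual protocol:

```python
# Toy model: there is no server-side group object; each client keeps
# its own view of the membership and applies update messages it receives.
class Client:
    def __init__(self, name: str):
        self.name = name
        self.members: set[str] = set()

    def apply_update(self, members: set[str]) -> None:
        # A "group update" message replaces this client's view of the group
        self.members = set(members)

alice, bob, eve = Client("alice"), Client("bob"), Client("eve")
for c in (alice, bob, eve):
    c.apply_update({"alice", "bob", "eve"})

# Admin "kicks" eve: an update is fanned out to the remaining members,
# but eve's device never applies it (e.g. she deleted and re-registered)
for c in (alice, bob):
    c.apply_update({"alice", "bob"})

assert "eve" not in alice.members  # honest clients stop including eve
assert "eve" in eve.members        # eve's client still believes she's a member
```

Enforcement depends entirely on honest clients cooperating, which is workable in a tiny group of known people and unworkable in a public group of thousands.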
And then, the biggest strengths of Signal, which are its end to end encryption and heroic attempts to avoid giving the server metadata, are less valuable in the context of a large public group: Anyone interested in surveilling the group can simply join it, so you have to assume you're being logged anyway. Signal lacks strong identities as a design choice, so in big groups it's harder to know who you're really talking to like you know that "Joe Example, founder of Foo Project" is @Foo1988 on Telegram and @FooOfficial on X and u/0xFooMan on Reddit.
This is one of those questions where it's hard to answer but it's obvious once you use it.
What's the difference between a Fiat and a Ferrari? What's the difference between CentOS and Linux Mint? What's the difference between a McDonald's burger and a Michelin one?
I have friends and groups on both platforms. On Signal, I'm basically just sending messages (and only unimportant ones, like "when are we meeting". Sending media mostly sucks, so I generally only have very dry chats on Signal).
Whereas on Telegram, I'm having fun. In fact it's so versatile that my wife and I use it as a collaborative note-taking system, archiver, cvs, live shopping list, news app (currently browsing Hacker News from Telegram), etc. We basically have our whole life organised via Telegram. I lose count of all the features I use effortlessly on a daily basis, and only realise it when I find myself on another app. This is despite the fact that both Signal and WhatsApp have since tried to copy some of these features, because they do so badly. A simple example that comes to mind: editing messages. It took years for WhatsApp to let you edit a message (I still remember the old asterisk etiquette to indicate you were issuing a correction to a previous message). Now you can, but it's horrible UX; I think you long-press and then there's a button next to copy which opens a menu where you find a pencil which means edit, or something like that. In Telegram I don't even remember how you do it, because it's so intuitive that I don't have to.
Perhaps that's why I find the whole "Telegram encryption" discussion baffling, to be honest. For me, it's just one of Telegram's many extra features. You don't have to use it, but it's there if you want to. I don't feel like Telegram has ever tried to mislead its users that its raison d'être is to be a secret platform only useful if you're a terrorist (as the UK government seems to want to portray it recently).
I get the point about "encryption by default", but this doesn't come for free; there are usability sacrifices that come with it, and not everyone cares for it. Insisting that not having encryption by default mars the whole app sounds similar to me saying not having a particular set of emojis as the default mars the whole app. It feels disingenuous somehow.
> What's in Telegram that you don't see in Signal?
The first feature that comes to mind for me is being able to use multiple devices. Signal only allows using it with one phone. If you add a second device, the first one stops working. You can use a computer and a phone, but not multiple phones. Telegram supports this without any issues. I still struggle to understand this limitation.
User base, large groups (I think the max is 200k members), channels, bots to automate work, animated stickers, video messages (not the call kind), video/voice calls within groups (not sure if Signal has that), file storage and file sharing, multiple devices without worrying about losing messages. And you might mention the security part, and that's OK: I want the accessibility; if I want security I will look somewhere else. Those are off the top of my head.
Anecdotal evidence, so take this with a grain of salt - I work with a bunch of people from Ukraine and almost all of them exclusively use Telegram to keep up with the news and family back home. From talking to them for a while, it's mostly because it's free, has excellent support for sync across multiple devices (including audio, video and other media), has support for proxies to circumvent any kind of blocking, public channels for news updates.
Honestly it would be better if Telegram dropped the facade of having E2EE. It's generally very low on the priority list of most people anyway, as much as it would hurt anyone reading this, but that's the truth. People are not using it for secure messaging, but for a better UX and reliability.
EDIT: Telegram does require a phone number to sign up.
Not a single person I know who uses Telegram cares about or thinks of it as e2ee. Whether "techie" or "non-techie" (whatever the definition of that is). People use it because it has a nice interface, was one of the first to have good "sticker" message support (yes, a lot of people care about that kind of stuff), and of course because of the good old network effect.
It's only on HN I ever see people set up Telegram as some supposed uber-secure private app for Tor users and then demolish that strawman gleefully.
Do you read other news sites that mention Telegram or is this an N=1 situation?
Today, on the same topic, another tech site which generally gets a lot of things right (but whoever is responsible for writing about Telegram, or maybe their internal KB, is consistently wrong and doesn't care about feedback) wrote that it is an encrypted chat service: https://tweakers.net/nieuws/225750/ceo-en-oprichter-telegram... ("versleutelde chatdienst" means "encrypted chat service", for those fact-checking at home)
You could also ask whether they think it's private. And if they say yes, ask them what that means. Does it mean only the sender and intended recipients can read the message, or is it fine if the service has someone check the content? Would they agree with the notion "it's OK that the nudes I send to my SO are up for grabs for anyone who hacks Telegram's servers", or do they think Telegram should plug this gaping hole?
Also, people tend to state they have nothing to hide, when they feel they have nothing to fight with. But I can't count the number of times I've seen a stranger next to me on a bus cover their chat the second I sit next to them. Me, a complete random person with no interest in their life is a threat to them.
For the past few weeks I've been using Telegram to create my own cool stickers, and when talking with people on WhatsApp (eughh) I find myself having trouble finding the words my Telegram stickers would convey.
Telegram is mostly used by people in the US for drug deals and chatting with people in Eastern Europe, so it's very common to believe it's a secure messenger.
Amplified by journalists, and most frustratingly to me even some techies that just can't be bothered to properly examine all available facts despite their technical capabilities to examine them.
100% this. Most people do not realize that all those non-secret messages from private chats and group chats are stored in a database that people at Telegram have access to.
I’d guess (not gonna test it but it feels reasonable) that “almost every non-techie” has a very vague idea of what e2ee even is, so it’s not clear where the worst part comes from. Pretty sure the best ideas they have about security are from hacker movies best case on average.
BS. The vast majority of non-tech users do not, for a simple reason: they couldn't know even if they cared, and they do not care. Even tech users can't be bothered to read links to the FAQ on the Telegram site.
There is so much misinformation around Telegram that that alone made me trust it more (if a known liar tries to discredit something, it increases the chances of it being good; I'm talking about comments here on HN).
I am a null at cryptography, but the following does not sound too bad as a default, to be honest. And I think it is misleading to focus solely on E2EE and not mention the distributed aspect.
> To protect the data that is not covered by end-to-end encryption, Telegram uses a distributed infrastructure. Cloud chat data is stored in multiple data centers around the globe that are controlled by different legal entities spread across different jurisdictions. The relevant decryption keys are split into parts and are never kept in the same place as the data they protect. As a result, several court orders from different jurisdictions are required to force us to give up any data.
> Thanks to this structure, we can ensure that no single government or block of like-minded countries can intrude on people's privacy and freedom of expression.
> Telegram can be forced to give up data only if an issue is grave and universal enough to pass the scrutiny of several different legal systems around the world.
> To this day, we have disclosed 0 bytes of user data to third parties, including governments.
You can coherently argue that encryption doesn't matter, but you can't reasonably argue that Telegram is a serious encrypted messaging app (it's not an encrypted messaging app at all for group chats), which is the point of the article. The general attitude among practitioners in the field is: if you have to reason about how the operator will handle legal threats, you shouldn't bother reasoning about the messenger at all.
Indeed. Even being charitable and assuming that they're not lying (they say elsewhere that they've shared zero bytes with law enforcement, despite this being demonstrably false), in reality if say, they were to arrest the founder in an EU country (France, perhaps), all they need to do is threaten him with twenty years in prison and I'm sure he'll gladly give up the keys from all the different locations they supposedly have.
Given that users can access their messages without any interaction with people at Telegram, automatic aggregation of the cloud data for a single endpoint must be in place.

As a consequence, the data can be accessed from a single jurisdiction anyway.
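That aggregation argument can be made concrete with a toy XOR key split: the shares can live in as many jurisdictions as you like, but if one serving process recombines them on demand, whoever controls that process controls the key. The datacenter names here are made up:

```python
import os

def split_key(key: bytes, n: int) -> list[bytes]:
    """XOR-split a key into n shares; ALL n are needed to recombine."""
    shares = [os.urandom(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def combine_key(shares: list[bytes]) -> bytes:
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

master_key = os.urandom(32)

# Shares stored with legal entities in different jurisdictions (hypothetical)...
datacenters = {}
for dc, share in zip(["NL", "SG", "AE"], split_key(master_key, 3)):
    datacenters[dc] = share

# ...but the serving path has to fetch and recombine them automatically to
# answer user requests, so a single compromised process yields the full key:
assert combine_key(list(datacenters.values())) == master_key
```

The legal split only protects data nobody is allowed to read online; data served on demand is one subpoena (or one coerced admin) away, wherever the shares sit.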
Wouldn’t being forced to give up the password and logging in be a violation of the 5th amendment, at least in the US? I think it’s a mixed bag of rulings right now, but it seems like it would make sense for it to fall that way at the end of the day.
The problem with this approach is that it relies on governments accepting your legal arguments. You can say "no, these are separate legal entities and each one requires a court order from a different country" all you want, but you also need to get the courts themselves to agree to that fact.
Problem with this claim is that it's hardly verifiable. Telegram's backend is closed source, and the only thing you can be sure of is that their backend sees every message in plaintext.
Maybe hijack the key and message before it gets distributed. Or just go after the pieces themselves where Chinese or Russian authorities can reach them. Or just threaten to close the local data center if they do not collect the pieces from elsewhere, and see if they can be convinced to hand over what they have, regardless of where they put it.
We can be null in cryptography, but handing over both the secret and the key to that secret to the very same party is quite a trusting step, even when they say 'I promise I will not peek or let others peek, pinky promise!', with an 'except if we have to or if we change our mind' in the small print or between the lines.
> Translated: Contrary to what has been publicly stated so far, the operators of the messenger app Telegram have released user data to the Federal Criminal Police Office (BKA) in several cases.
> Telegram has complied with an order from the High Court in Delhi by sharing user details of copyright-infringing users with rightsholders.
Anyway, just some examples where their structure doesn't matter. In the end, user data is still given away. It's also why E2EE should be the sole focus; everything else is "trust me bro, it's safe" levels of security.
>To protect the data that is not covered by end-to-end encryption, Telegram uses a distributed infrastructure. Cloud chat data is stored in multiple data centers around the globe that are controlled by different legal entities spread across different jurisdictions.
In practice it also didn't work: only one government was needed to arrest the guy. And now all they need is a hammer or some pliers. No need for multiple governments to coordinate.
> The relevant decryption keys are split into parts and are never kept in the same place as the data they protect. As a result, several court orders from different jurisdictions are required to force us to give up any data.
Or the CEO and owner, staring down the barrel of a very long time in prison, obtains the keys from his employees and provides them to the authorities.
Would he do this? To me, it matters little how much I trust someone and believe in their mental fortitude. I could instead rely on mathematical proofs to keep secrets, which have proven to be far better at it than corporations.
I am wondering if there was any incident that disproved the “we have disclosed 0 bytes of user data to third parties, including governments.” statement.
Clearly the investigating authorities are not buying that argument because, well,
it's completely absurd. Both technically and legally, Telegram are in control of those keys, regardless of where they are hosted.
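To make that concrete: splitting a key into parts is just secret sharing, and here's a toy Python sketch (purely my illustration, not Telegram's actual scheme) of XOR-based splitting. The point is that whoever can gather all the shares, such as the operator itself, trivially recovers the key:

```python
import secrets

def split_key(key: bytes, n_shares: int) -> list[bytes]:
    """Split a key into n XOR shares; ALL shares are needed to rebuild it."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n_shares - 1)]
    last = key
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def combine(shares: list[bytes]) -> bytes:
    """XOR all shares back together to recover the original key."""
    key = bytes(len(shares[0]))
    for s in shares:
        key = bytes(a ^ b for a, b in zip(key, s))
    return key

key = secrets.token_bytes(32)
shares = split_key(key, 3)        # e.g. one share per jurisdiction
assert combine(shares) == key     # an operator holding all shares recovers it
```

Scattering the shares geographically changes nothing about who controls them; a court only needs leverage over the one party that can call them all in.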
That’s Telegram's CEO saying how he and his employees were “persuaded and pressured” by US FBI agents to integrate open-source libraries into Telegram (1). There are a lot of questions to ask, like whether those open-source libraries are indeed compromised, among other things. I take it that this arrest was the final straw to pressure him to give up and hand over some “needed” data, as all the accusations I've read are laughable. Instagram is full of human trafficking and minor exploitation, drug dealers, and worse. The same goes for other social media, and I don’t see Elon or Zuck getting arrested. I am confident that this arrest is to obtain specific information, and after that he will be released, or spend 20 years if he doesn’t comply.
"At St. Petersburg State University, Mr. Durov studied linguistics. In lieu of military service, he trained in propaganda, studying Sun Tzu, Genghis Khan and Napoleon, and he learned to make posters aimed at influencing foreign soldiers."
You really think the FBI would casually go to Durov and start telling him which libraries to deploy in his software?
This "they're trying to influence me, that means it's working" 5D chess is the most stupid way to assess the security of anything.
There's nothing to backdoor because it's already backdoored:
Code does not lie about what it does. And Telegram clients' code doesn't lie either: it doesn't end-to-end encrypt the data it outputs to Telegram's servers. That's the backdoor. It's there. Right in front of you. With a big flashing neon light that says "backdoor". It's so obvious I can't even write a paper about it, because no journal or conference would accept me stating the fucking obvious.
I do wonder if this would hold up, though. If Telegram stored each character of your chat in a different country, could a single country not force them to hand over the data, and either fine them or force them to stop operating if they wouldn't share the full chat? It seems like a loophole, but I don't know what the precedent is.
Oh, I must have missed this. Please tell me how to enable secret chats for groups. And my desktop chats. Also, I'd like to turn on the setting that defaults to secret chats whenever I open a new one. Oh? I can't. Sounds like it's not there if I want it, after all. Good thing they didn't force it on me though /s
It's because Telegram is marketing itself as a secure messaging app, and because journalists continuously present it as such while discussing the arrest of its CEO.
Exactly this. It is all about how they market themselves. If they had promoted themselves as a social media-ish platform, nobody would be causing a fuss about their encryption.
Neither Discord, nor any of the popular IRC clients (HexChat, WeeChat, mIRC), even mentions the words "security" or "privacy" to promote their products.
Moreover, as Matthew Green mentioned in his blog post, there are many instances where Telegram (or Pavel Durov) has gone out of its way to attack the encryption offered by Signal and WhatsApp.
If he were pitting his messenger against Discord, why would he be worried about Signal or WhatsApp?
Thanks for the blog post, now I finally have a good resource I can point people to next time they claim Telegram is secure.
> I am not specifically calling out Telegram for this, since the same problem [with metadata] exists with virtually every other social media network and private messenger.
Notably, Signal offers a feature called Sealed Sender[0]. While it doesn't solve the metadata problem entirely, it does at least reduce it a bit.
Interesting. I feared Sealed Sender might be susceptible to statistical analysis (hence my phrasing "reduce it a bit"), but it's worse than I expected ("Signal could link sealed sender users in as few as 5 messages"). Thanks for the link!
As for TOR, that wouldn't really help much, would it, given that the described attack is at the application level of Signal. Or are you talking about not using Signal altogether?
This is part of what I love about Mastodon: if you PM someone, very often you're talking between two random servers and odds are good that the admin is a friend of a friend. No dragnet statistical analysis stuff, just friends running some software that normal people can also use. Distributed systems at their best
If Telegram's encryption is so bad, why is Pavel Durov under arrest?
The arrest cites that he was not cooperating with authorities to crack down on various illegal drug activities on Telegram. None of the other social networks have had their CEOs arrested. Is it simply that Telegram is the only one without backdoors for Five Eyes?
It seems to me the secret chat feature actually works too well?
I'd suggest waiting for more details from French officials, they have already said that they'll address it tomorrow. So far claims from the media sound like Durov's being prosecuted due to very little moderation on the platform, not because of E2EE.
Even so, most messages sent on Telegram are plaintext; they're only encrypted at the transport layer, and Telegram's servers see them in full. Secret chats (the only E2EE chats on Telegram) are hidden away from users, hence the original link.
Telegram channels are public, unencrypted web shops for all kinds of illegal goods. I guess the French government alleges that Durov is not doing enough to stop these activities on his platform.
It doesn't necessarily have anything to do with encryption.
It indirectly has a lot to do with encryption, in that if Telegram were actually encrypted, they'd probably have no grounds for holding him in the first place.
(At least at the moment, in most countries) it's not illegal to not ship a backdoor in your end-to-end-encrypted software upon government request, but in most it is illegal to not share data you're holding in a form accessible to you when you receive a warrant for it.
The difference between telegram and others is that in telegram you can type "<city> drugs" to global search and find groups with drug dealers and buyers near you instantly. I don't think his arrest has anything to do with the level of encryption at all.
Personally I find Telegram kind of refreshing in today's internet landscape, where everything is so sanitized. You can discover all kinds of niches you never knew existed.
> Is it simply that telegram is the only one without backdoors for five eyes?
Do you honestly think that any backdoor would be used for such mundane crimes? Even more so, it being in any way acknowledged that there might be a backdoor?
On that topic, it's highly likely Telegram is cooperating with Russian law enforcement. Services and people that don't cooperate get thrown out of Russia quickly.
> The arrest cites that he was not cooperating with authorities to crack down on various drug illegal activities on telegram. [...] None of the other social networks have their ceos arrested.
Because if you want to operate in any country, you're either cooperating with the authorities or you'll get shut down or arrested. Hiding evidence you have is not tolerated anywhere.
I can give you some insight into why EU law enforcement and politicians dislike telegram. It’s not because they can’t snoop on you, it’s because Telegram fails to comply with moderation requests for channels where illegal content is shared.
We had a nice scandal of sorts here in Denmark where a bunch of young men shared pictures of young women without consent. If you’re old enough to remember those old “rate this girl” web pages from the '90s, you’ll know what the pictures were used for. Basically it was a huge database of hot girls in Denmark and where they went to school. Today around 1000 young men have that on their permanent record, as Facebook worked with law enforcement to catch the criminals. Telegram doesn’t do that. This was even a little more innocent than it may sound, considering the men were at least of similar age to the women whose pictures they were sharing. Disgusting and illegal, but Telegram houses far worse and refuses to deal with it.
I know a lot of tech-minded people are up in arms over this, but it’s really mainly about not wanting an unmoderated social network. Not because big brother is angry, but because people use it to organise bullying, share revenge porn, sell drugs, and far, far worse. There are also political factions within the EU who want to kill encryption (though they were severely weakened when the Brits left), but the anger against social media platforms is much more “European”. In that we (and I say this as the EU culture in general, not as in 100% of us) tend to view the people who enable bad behaviour as participating in that behaviour. Platforms like Facebook, Twitter, Instagram and YouTube have been sort of protected by being early movers with mass adoption. Being American companies probably helps as well, considering EU / US relations. Telegram never had such advantages, and is further disadvantaged by how it's almost exclusively used for crime in Western Europe.
Obviously banning the platform won’t help. There will just be another platform. But then, we’ve also been losing a drug war for 50+ years even though we can’t even keep drugs out of our prisons.
>If Telegram's encryption is so bad, why is Pavel Durov under arrest?
Because it was so bad that he had access to all that content; and because he had access to it, he should have moderated it; and because he didn't, he's now under arrest.
>Is it simply that Telegram is the only one without backdoors for Five Eyes?
Telegram doesn't have a backdoor. Its open-source client can be used to verify that it leaks every group message, and every desktop message you ever send, to the service provider without ever applying secret-chat-grade encryption.
>It seems to me the secret chat feature actually works too well?
Well, Signal can be used to verify its end-to-end encryption is actually used everywhere, yet nobody's calling for arresting Moxie or Meredith. So maybe playing 5D chess over the news isn't working, unless you're here just to amplify this ridiculously fallacious line of thinking.
The arrest was about the expected removal of illegal and harmful content in groups that the masses see, so no encryption involved. Did you not read the news, AND the blog, in full?
I am amazed at the low-quality comments here. Encryption really doesn’t matter as much as trust in the app here. Any malicious app author can 100% securely encrypt everything on the wire and yet leak 100% of your data to some state actor. Anything you type into the chat box is only encrypted by the app after you type it, and is probably stored in the clear in some local SQLite DB. It gives them a whole bunch of options to mess with that plaintext data. Even if the app's source code is published, you don’t know if they backdoored it before they submitted it to the App Store.
>Encryption really doesn’t matter as much as trust in the app here. Any malicious app author can 100% securely encrypt everything on the wire and yet leak 100% of your data to some state actor.
This is exactly the problem with Telegram. Telegram defaults to client-server encryption for everything, and you can't enable end-to-end encryption for anything on desktop, or for group chats, ever. Only 1:1 chats and calls on mobile have end-to-end encryption. Client-server encryption is exactly that "100% securely encrypt on the wire". When the data arrives at the server, it's no longer encrypted, and Telegram can do whatever it wants with it, including leaking it to some state actor (like FSB/SVR).
>Anything you type into the chat box is only encrypted by the app after you type it, and is probably stored in the clear in some local SQLite DB.
If endpoint security is of concern, your options with networked TCBs are quite limited. Are you sure the malware doesn't have a chance to escalate its privileges and read messages in clear from RAM?
>It gives them a whole bunch of options to mess with that plaintext data.
I'm looking forward to hearing about how you managed to fix this. Should we implement memory as eFuses (https://en.wikipedia.org/wiki/EFuse) to prevent editing logs? What if the user wants to delete his messages?
>Even if the app's source code is published, you don’t know if they backdoored it before they submitted it to the App Store.
E.g. with Signal on Android, you can pull the APK off the device and compare its hash against the client that was reproducibly built from the source code in your possession. Been there, done that: https://imgur.com/a/wXYVuWG
>I am amazed at the low quality comments here.
Too bad you're not exactly improving them with your nonsense.
> malicious app author can 100% securely encrypt everything on the wire and yet leak 100% of your data
Um, surely you understand the difference between piping random-looking bytes uselessly to whoever, and having a readable copy of all data readily available to whoever hacks the system or applies for a sysadmin role? Or are you assuming that people use a closed-source client and the server can push malicious code?
> Even if the app's source code is published, you don’t know if they backdoored it before they submitted it to the App Store.
Doesn't work if you have third parties also working with the system or forking the code to work with it; it gets noticed. Your concept of "E2EE can be 100% leaked anyway" only works if you don't know what code you're running. You need to trust the community in general to uncover issues you've overlooked (in the code or the build process), but that's not the same as not having encryption at all. You can't audit the servers, but you can audit the client code.
> You need to trust the community in general to uncover issues
My point is that this community could just be your friendly CIA operatives running the show with a veneer of open source. Also, this “community” has no liability, unlike the closed-platform companies.
I use it for friends, family and my partner, for video calls and normal chat.
Sure, it may not be on the same level as Signal when it comes to security, but it simply is leagues above others in terms of usability, stability and bells & whistles. It's like comparing a Ford Zephyr with a Volvo EX30.
I agree, but I wouldn’t compare Signal to a Zephyr. Classic cars have that charm and magic. I would say Signal is more like a Honda Civic; its users are loud and annoying, and yet it’s mediocre in all categories. :)
Only the secret chat is E2E encrypted. All the other chat options are not. I think calls are also not encrypted, since they appear in the normal chat history, not in the E2E chat.
Obviously if your phone is compromised your e2ee chat is not safe.
> Obviously if your phone is compromised your e2ee chat is not safe.
Pretty much. A lot of people think that seeing E2EE means everything is safe, which I believe gives a false sense of security. You can have your phone compromised (especially when I know your phone number; Signal, I'm looking at you) or be subject to other means of attack, exposing everything. I would rather know that an app is not secure, so I don't share anything important, while keeping secure communication to other means.
Not only that. If they want to intercept E2E chats, it's possible with a MITM attack, which, if you control the server, is not difficult to pull off. Of course, if users check the keys they will see they are different, but practically no one does that.
And I think WhatsApp probably does this; otherwise, why have the authorities never complained that WhatsApp did not let them see the conversations?
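For reference, the key check being described is usually a short fingerprint derived from both parties' public keys. Here's a toy Python sketch (my own illustration, not Signal's or Telegram's actual derivation, and the key bytes are placeholders) of why a server-side key swap shows up when users compare codes out-of-band:

```python
import hashlib

def safety_code(pub_a: bytes, pub_b: bytes) -> str:
    """Derive a short comparison code from both parties' public keys.
    Keys are sorted so both sides compute the identical code."""
    digest = hashlib.sha256(b"".join(sorted([pub_a, pub_b]))).hexdigest()
    # Render as short groups, roughly in the spirit of Signal's safety numbers.
    return " ".join(digest[i:i + 4] for i in range(0, 24, 4))

alice_pub, bob_pub = b"alice-public-key", b"bob-public-key"
mitm_pub = b"attacker-public-key"

honest_code = safety_code(alice_pub, bob_pub)
# Under a MITM, each victim is unknowingly paired with the attacker's key:
alice_sees = safety_code(alice_pub, mitm_pub)

assert honest_code != alice_sees  # an out-of-band comparison exposes the swap
```

The crypto holds up only if people actually compare the codes over a channel the server doesn't control, which, as noted above, practically no one does.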
Stealing someone's phone number wouldn't give you any Signal data though, right, since all the messages have perfect forward secrecy? And all contacts would see an alert that your safety number had changed. Not completely foolproof, and I would like Signal to use something other than phone numbers for accounts, but it's pretty good.
>You can have your phone compromised (especially when I know your phone number, Signal I’m looking at you) or be subject to other means of attacks, exposing everything.
Knowing someone's phone number doesn't automatically let you compromise their device. This is such a ridiculous argument.
>I would rather know that this app is not secure so I don’t share anything important, while keeping secure communication to other means.
This is nirvana fallacy. It's essentially saying "We should not talk about Telegram lying about its security, when in reality nothing is 100% secure". Yeah, nothing is, there's always an attack. That doesn't contribute anything of interest to the topic, it just tries to kill the criticism. And I'm saying this as someone who has worked on this exact topic for ten years: https://github.com/maqp/tfc
Depends on who your adversary is and how much you trust their protocol (some weird homegrown thing with clever/questionable cryptographic choices, the last time I checked) and implementation. Your texts don't generally run through Telegram's infrastructure, for example.
> Obviously if your phone is compromised your e2ee chat is not safe.
Yes, and that's where the 'practical' argument pops up. With all the E2EE buzz, is it really helping in the scenarios where it's supposed to work the best?
> The broader problem of ephemeral or spur of the moment protest activity leaving a permanent data trail that can be forensically analyzed and target individuals many years after the fact is unsolved and poses a serious risk to dissent. But E2E is not the solution to it.
> I feel like Moxie and a lot of end-to-end encryption purists fall into the same intellectual tarpit as the cryptocurrency people, which is that it should be possible to design technical systems that require zero trust, and that the benefits of these designs are self-evident
> One of the biggest privacy problems in messaging is the availability of loads of meta-data — essentially data about who uses the service, who they talk to, and when they do that talking. […] the same problem exists with virtually every other social media network and private messenger.
Avoiding any metadata leaks without generating tons of cover traffic (to frustrate timing correlation attacks) is very hard.
Signal does indeed use an architecture (at least for chats with contacts, or optionally everyone when you enable the "sealed sender" option that makes you a bit more prone to receiving spam) where Signal doesn't know who's sending a given message from a given IP address, and only which account it's destined for.
But any entity in position to globally correlate traffic flows into and out of Signal's servers can just make correlations like "whenever Alice, as identified by her phone's IP, sends traffic to Signal, Bob seems to be getting a push notification from Apple or Google, and then his phone connects to Signal, so I think they're talking".
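A toy Python sketch of that kind of timing correlation (synthetic timestamps, purely illustrative; real attacks are far more sophisticated):

```python
def correlate(send_times, recv_times, window=0.5):
    """Fraction of Alice's sends that are followed by a Bob-side event
    within `window` seconds: a crude timing-correlation score."""
    hits = sum(
        any(0 <= r - s <= window for r in recv_times)
        for s in send_times
    )
    return hits / len(send_times)

# Bob's push notifications trail Alice's sends by ~0.2 s:
alice_sends = [10.0, 42.5, 97.3, 130.1]
bob_notifs  = [10.2, 42.7, 97.5, 130.3, 200.0]
unrelated   = [5.0, 60.0, 88.0, 150.0]

print(correlate(alice_sends, bob_notifs))  # 1.0 -> likely talking to each other
print(correlate(alice_sends, unrelated))   # 0.0 -> probably unrelated
```

With enough observations, even a noisy version of this score separates real conversations from coincidences, which is why defeating it requires cover traffic rather than better message encryption.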
How accurate does the timing need to be? I imagine there must be many Bobs getting notifications around the same time. Also, if I use Signal behind a VPN is it still known that I’m talking to the Signal servers?
> Is this true for Signal too? I thought it wasn’t.
It is, because you cannot use Signal without giving them your mobile phone number, and from that point onward they (and anyone they might be sharing data with) know the who/what/when, and more. My gut feeling, notwithstanding any apologist and their weak arguments, is that the design choice is exactly about the who/what/when because it's mandatory despite being entirely unnecessary from a technical perspective.
Of course not. The genius of Durov was in discovering that users don't really need e2ee and all the drawbacks that come with it, and that promising them that the app has really strong encryption is good enough even without actual encryption.
>Many systems use encryption in some way or another. However, when we talk about encryption in the context of modern private messaging services, the word typically has a very specific meaning: it refers to the use of default end-to-end encryption to protect users’ message content. When used in an industry-standard way, this feature ensures that every message will be encrypted using encryption keys that are only known to the communicating parties, and not to the service provider.
>From your perspective as a user, an “encrypted messenger” ensures that each time you start a conversation, your messages will only be readable by the folks you intend to speak with. If the operator of a messaging service tries to view the content of your messages, all they’ll see is useless encrypted junk. That same guarantee holds for anyone who might hack into the provider’s servers, and also, for better or for worse, to law enforcement agencies that serve providers with a subpoena.
>Telegram clearly fails to meet this stronger definition for a simple reason: it does not end-to-end encrypt conversations by default. If you want to use end-to-end encryption in Telegram, you must manually activate an optional end-to-end encryption feature called “Secret Chats” for every single private conversation you want to have. The feature is explicitly not turned on for the vast majority of conversations, and is only available for one-on-one conversations, and never for group chats with more than two people in them.
The worst part is that Telegram Secret Chats are limited in functionality compared to normal ones, for no reason. Sticker sets don't work, for example, and that's one of the main features of Telegram chats.
For me Telegram is more like an uncensored Twitter slash blog platform. I use it to check out public channels for updates and that's about it. For private communication, I use Whatsapp. So, lack of e2e by default is not an issue for me at all.
I guess you meant to say Discord is a worse version of Telegram, which was created earlier. Though obviously many group features landed in Telegram around the same time Discord gained traction.
Still not indexable, referencable, or freely readable
It's a walled-garden system, which is fine for private chats between groups of friends, but Discord is increasingly being used as a place to report bugs and share information. Telegram, furthermore, requires signing up with a phone number, which Discord did not (now you often need to in order to participate, when an admin of a community, aka guild, aka the misnomer "server", turns on that requirement).
https://xkcd.com/979/ This comic will not be understood by gamers growing up today... (Except in many cases someone posted a solution or nudged DenverCoder9 in the right direction at least; with Discord, Slack, or Telegram you'd simply never find the thread in a search engine to begin with.)
> Discord is increasingly being used as a place to report bugs and share information.
So is Telegram. I'm in numerous groups with developers of Linux distros and other apps. Many developers use Telegram channels to post updates about their work.
As mentioned in a comment to one of your posts, the GNUnet people have probably gone the furthest in the quest to obfuscate metadata. Unfortunately, to this day no usable messenger application has come out of this, partially because GNUnet has largely been a research project.
As for applications in use today that address the metadata problem, have a look at Signal's Sealed Sender feature: https://signal.org/blog/sealed-sender/
As for recommending Telegram for secure messages, I side with the sibling comments ("Don't").
Since you seem to focus on decentralized protocols, I should add: In practice, while we all like federated and p2p apps for the freedoms & this warm fuzzy feeling they provide us with, by default they tend to have a much greater attack surface when it comes to metadata. This is because, compared to a centralized approach, metadata is openly available to far more parties. As a result, 3-letter agencies often won't even need a warrant to get their hands on the metadata: They can simply run traffic analysis and/or participate in the network themselves.
> I was just recommending Telegram as alternative to WhatsApp
If you care about privacy and security, please don't. Defaults matter, and private chats are effectively unusable for anyone using more than one device or needing group chats. And that's not even considering their strange home-baked cryptography.
I am recommending both. The problem with Signal (which I use along with the other messaging apps) is that it is not as feature-rich as the other two, and Signal is not popular, so people download it just to interact with one person (me), whereas Telegram has a larger user base.
For metadata, you first want to remove the obvious identifiers: phone numbers and names. You'd want to use something like anonymous@jabbim.pl for your IM account.
Next, you'd want to hide your IP address from the server, so you'd connect exclusively through Tor. You'd set the IM client's proxy settings to SOCKS5 localhost:9150 and run the Tor client to force your client to connect that way. This is error-prone and stupid, but let's roll with it for a second.
Now jabbim.pl won't be able to know who you are. But unless you registered your XMPP account over Tor Browser, you're SoL: they already know your IP.
A better strategy is to use a Tor Onion Service based XMPP server, say
4sci35xrhp2d45gbm3qpta7ogfedonuw2mucmc36jxemucd7fmgzj3ad.onion (not a real one), and you'd register to it via IM client. Now you can't connect to the domain without Tor, so misconfiguring can't really hurt.
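If you're curious what "set the proxy to SOCKS5 localhost:9150" actually does on the wire, the client just speaks plain SOCKS5 to the local Tor listener. A minimal Python sketch (my own illustration, assuming Tor Browser's default SOCKS port 9150; the hostname, including .onion addresses, is resolved on the proxy side):

```python
import socket
import struct

def build_connect_request(host: str, port: int) -> bytes:
    """SOCKS5 CONNECT request using the domain-name address type, so the
    proxy (Tor) resolves the name and .onion addresses work."""
    return (b"\x05\x01\x00\x03"
            + bytes([len(host)]) + host.encode()
            + struct.pack(">H", port))

def socks5_connect(host: str, port: int,
                   proxy=("127.0.0.1", 9150)) -> socket.socket:
    """Open a TCP connection to host:port through a local SOCKS5 proxy."""
    s = socket.create_connection(proxy)
    s.sendall(b"\x05\x01\x00")          # version 5, one auth method: none
    if s.recv(2) != b"\x05\x00":
        s.close()
        raise ConnectionError("proxy refused the no-auth handshake")
    s.sendall(build_connect_request(host, port))
    reply = s.recv(10)
    if len(reply) < 2 or reply[1] != 0x00:
        s.close()
        raise ConnectionError("proxy could not reach the destination")
    return s
```

The key detail is the `\x03` (domain name) address type: the client never resolves the hostname itself, so no DNS lookup leaks outside Tor.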
So that covers name and IP. We'll assume the content was already end-to-end encrypted, so that leaks no data.
Next, we want to hide the social graph, and that requires getting rid of the server. After all, a server requires you to always route your messages through it, and the service can see that this account talks to that account, then to these ten accounts, and ten minutes later those ten accounts talk to ten more accounts. That sounds like a command structure.
So for that you want to get rid of the server entirely, which means going peer-to-peer. Stuff like Tox isn't Tor-only so you shouldn't use them.
For Tor-only p2p messaging, there are a few options:
https://cwtch.im/ by Sarah Jamie Lewis (great, really usable, beautiful)
https://briarproject.org/ (almost as great, lots of interesting features like forums and blogs inside Tor)
>On a side note, I was just recommending Telegram as alternative to WhatsApp
Don't. Telegram and WhatsApp both leak metadata, but WhatsApp is always end-to-end encrypted, while Telegram is practically never end-to-end encrypted. I'd use WhatsApp over Telegram any day. But given that Signal, unlike WhatsApp, is open source, so you know the encryption works as advertised, it's the best everyday platform. The metadata-free ones I listed above are for people in more precarious situations, but I'm sure a whistleblower is mostly safe when contacting journalists over Signal. Dissidents and activists might find Cwtch the best option, however.
It is weirdly fascinating that this question has to be answered on a semi-regular basis. I am not sure if it is more of an insight into humans, into the ephemeral nature of software, or a sign of concern that something major has changed.
Or it's just nerds who are stupid and don't understand what matters in real world security for most people.
The fact that you can create huge groups and channels without sharing your phone number and contacts is what made Telegram big.
You couldn't do that on WhatsApp until a few months ago, and it has been on Telegram for years. Why did Hong Kong protesters use Telegram and not WhatsApp? Read this: https://x.com/Pinboard/status/1474096410383421452
The fact that Telegram is massively used in both Ukraine and Russia shows that its model cannot be ignored.
I think it’s helpful because, as the author says, Telegram put effort into making you think it’s secure and Signal isn’t. As someone who's not close to this, it’s handy to have regular reminders.
>One of the biggest privacy problems in messaging is the availability of loads of meta-data — essentially data about who uses the service, who they talk to, and when they do that talking.
>I am not specifically calling out Telegram for this, since the same problem exists with virtually every other social media network and private messenger.
In fact, https://simplex.chat/ is the messenger with the least amount of metadata.
This snake oil is spreading like [Herpes] Simplex .
Again, the company lies about queues (a programming technique) being a privacy feature.
The application cannot get rid of the metadata of the server knowing which IPs are conversing, unless the clients explicitly connect to the service via Tor. The server must always know from which connection to which connection it routes packets; it's not a network hub, it's a switch, after all.
https://cwtch.im/ and https://briarproject.org/ route everything through Tor, always, and they don't have a server in the middle, which means there is no centralized authority to collect metadata. It's light-years ahead of what SimpleX pretends to offer.
One of the biggest, more significant as well as successful Internet-scale cons of the last decades that I can think of, apparently perfectly executed too.
This article discusses a point about Telegram that is well known, but only to techies. The vast majority of users are misled by journalists, many of whom have degrees in social "science", political "science", etc. That's not to say you need encryption; that's for each person to decide, perhaps per conversation. But it needs to be an educated choice.
Though it's old hat, it's better to recycle this often so more people know.
Reads like a hit piece on Telegram from a crypto expert who couldn't be bothered to explain, in more than one paragraph, why the app he calls "not an encrypted app" (according to how he personally thinks everyone refers to encryption) actually uses some encryption technology that he's not exactly sure of but suspects is insecure.
He specifically explains what people think an encrypted app is:
>Many systems use encryption in some way or another. However, when we talk about encryption in the context of modern private messaging services, the word typically has a very specific meaning: it refers to the use of default end-to-end encryption to protect users’ message content. When used in an industry-standard way, this feature ensures that every message will be encrypted using encryption keys that are only known to the communicating parties, and not to the service provider. From your perspective as a user, an “encrypted messenger” ensures that each time you start a conversation, your messages will only be readable by the folks you intend to speak with.
So an encrypted messaging app means, to people, the security that an end-to-end encrypted app provides.
He then explains how Telegram is not end-to-end encrypted.
* No end-to-end encryption by default
* No end-to-end encryption for groups, not even small groups.
To add, there's no end-to-end encryption for desktop chats either. And no end-to-end encrypted cross-platform chats either.
Your post reads like a dollar-store damage-control post from someone who didn't even read the article they're trying to discredit.
TL;DR: 99.95% of messages on Telegram are stored as plain text on their servers and are only encrypted between the client and Telegram's servers. End-to-end encryption only works for 1-on-1 chats, isn't available in half of their clients, and has terrible UX.
All this is just wrong. I wonder why HN likes throwing up wrong information about Telegram as fact. Is taking up 5 mins to proof these claims that hard?
> 99.95% of messages on Telegram stored as plain text on their servers and only encrypted between client and telegram server.
Wrong, and OP doesn't even mention plain text. The non-E2EE client-server data is stored encrypted, spread across servers in different countries.
https://telegram.org/privacy#3-3-1-cloud-chats
> End-to-end encryption only working for 1on1 chats, not available half of their clients and have terrible UX.
Wrong again. I actually recently checked this for myself: their official clients on Android and Linux desktop support MTProto 2.0 secret chats. Feel free to check whether other OSes lack this feature. The only clients I know of where this is not enabled are the web clients.
Something that might be interesting on this topic: a forked version [0] of the Telegram client made during the protests in Belarus in 2020 (it appears to be actively maintained to this day). Can't vouch for it, but I found it interesting.
Does anyone have any reason to believe that Telegram's E2EE doesn't have a backdoor? Because if not, then I fail to see why it matters whether the E2EE even exists in the first place.
Telegram clients are open source. Anyone can verify that the client does the end-to-end encryption correctly.
Telegram has had its own history of really weird issues with its encryption protocol, like the IGE, 2^64 complexity pre-computation attacks, IND-CCA vulnerability and whatever the hell this was https://words.filippo.io/dispatches/telegram-ecdh/
But these are not the big issues here. The issues Green's blog post highlighted were
* Telegram doesn't default to end-to-end encryption.
* It makes enabling end-to-end encryption unnecessarily hard
* It has no end-to-end encryption for groups
Those matter a gazillion times more than e.g. a slightly older primitive would.
End-to-end encryption matters because Telegram is not just a social media feed or a Twitter wall. It's used for purposes that deserve privacy, and Telegram isn't providing it.
A simple question determines whether it's encrypted:
Does the cloud server store the message and the key?
If the answer is yes, IT'S NOT FULLY ENCRYPTED!
Sounds contrary, right?
If the key and the message are on the server, any LEO org can get them. For it to be fully encrypted, the cloud server should never store the keys.
So how many services claiming encryption have this flaw? All....
Why do you think Telegram has shell companies to avoid government subpoenas?
Because it knows its encryption is vulnerable to real-world law enforcement and laws: it stores the keys in the cloud, which means it can be subpoenaed for those keys and messages.
>So how many services claiming encryption have this flaw? All....
Telegram is actually one of the only apps I've seen to defend their super-duper secure storage of keys online. All lies of course.
The overwhelming majority of secure messaging apps have no way to recover user data if you drop your phone in the ocean. This includes Signal, Wire, Threema, Session, Element, iMessage etc.
This is actually a great blog post, since too many people tend to believe that Telegram is somehow more secure and private than the alternatives on the market.
Also, it's not like Telegram doesn't have censorship. Over the last 3-4 years there were many cases where Durov blocked bots and channels belonging to protests and the opposition in Russia, marked them as "fake", or just plain removed them without a trace.
So it's just another case where some rich guy tries to sell his own platform as a "freedom of speech" one even though it's censored to his liking.
It's not e2e encrypted, so what? It's something the majority of users don't need, and it doesn't increase security that much given its downsides.
Of course, for Telegram it's much more convenient not to have end-to-end encryption. Given that they store everything on their servers, it means years of chat history that probably weighs gigabytes for each user, contrary to what WhatsApp/Signal do. And of course, if 10 million people send each other the same meme, it's stupid to have 10 million copies of the same image on their servers just because it's end-to-end encrypted. They probably have a store where they index each media file by its hash and avoid keeping multiple copies, which is fine. This is the reason Telegram can offer to have all your messages, including media that can be up to 1 GB each, stored in the cloud for free.
As a user, I prefer Telegram just because it's the only app that works perfectly synchronized across multiple devices (Android, Linux, macOS) with good-quality native clients, without wasting space on my phone.
By the way, end-to-end encryption isn't as safe as they claim. Sure, the conversation cannot be intercepted, however:
- you can put a backdoor on the endpoints, i.e. compromise the user's phone (something they do)
- you can make a MITM attack on the server (I don't know if they do that, but it's technically possible)
- you can access the data that is backed up on other platforms (e.g. WhatsApp by default backs up to Google Drive or Apple iCloud, through which you can access all the conversations in clear text).
> By the way, end-to-end encryption isn't as safe as they claim. Sure, the conversation cannot be intercepted, however: [...]
> - you can make a MITM attack on the server (I don't know if they do that, but it's technically possible)
No it's not technically possible, by its very definition. The fundamental principle behind E2EE is that the server can be malicious or compromised all you want, but this does not impact message confidentiality or integrity.
>It's not e2e encrypted, so what? It's something the majority of users don't need, and it doesn't increase security that much given its downsides.
Privacy is a human right. Everyone needs it. And Telegram advertises itself as an encrypted messenger. For every non-expert, that means end-to-end encryption: only me and the recipient can read the message. Users expect Telegram to be more secure than WhatsApp. Telegram claims it's more secure than WhatsApp, and Telegram has attacked WhatsApp over its security. WhatsApp is always end-to-end encrypted; Telegram is not. So don't go putting words into people's mouths.
>Given that they store everything on their servers, it means years of chat history that probably weighs gigabytes for each user
It could be stored there with client-side encryption; Telegram doesn't need to have access to that data. Also, who says chats that are ephemeral in nature need to be forever accessible? I save what I need from Signal or Telegram.
>This is the reason Telegram can offer to have all your messages, including media that can be up to 1 GB each, stored in the cloud for free.
It's not free. It comes at the price of your human right to privacy. You should get a job at Facebook with this marketing pitch.
>As a user, I prefer Telegram just because it's the only app that works perfectly synchronized across multiple devices
It doesn't sync secret chats at all with multiple devices, not even desktop. Signal does.
>You can put a backdoor on the endpoints, i.e. compromise the user's phone (something they do)
Nirvana fallacy. Why is Telegram offering secret chats if all endpoints are compromised? If they're not always compromised, then it should offer end-to-end encryption for everything, always. Like Signal, WhatsApp, Wire, Threema, iMessage, Cwtch, Briar, Element, Session...
Even Telegram has them, although their initial implementation, babby's first QR code, was a joke. How do you compare shades of a color matrix over the phone?
>you can access the data that is backed up on other platforms
Oh, that would be horrible. Good thing Telegram doesn't have its data backed up in the cloud... no wait, sorry, it does. ~Everything you ever do with the app is permanently stored in an ecosystem built by the Mark Zuckerberg of Russia and his PhD-in-geometry bro Nikolai.
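A comparison code for out-of-band verification, like the color matrix being mocked above, is typically a short number derived from both parties' public keys. A minimal sketch of the idea (illustrative only, not Telegram's or Signal's actual fingerprint algorithm):

```python
import hashlib

def safety_code(key_a: bytes, key_b: bytes, groups: int = 6) -> str:
    # Sort the keys so both parties compute the same code
    # regardless of who runs this first.
    material = b"".join(sorted([key_a, key_b]))
    digest = hashlib.sha256(material).digest()
    # Reduce 4 bytes of digest per group to a 5-digit number.
    chunks = [int.from_bytes(digest[i * 4:(i + 1) * 4], "big") % 100000
              for i in range(groups)]
    return " ".join(f"{c:05d}" for c in chunks)

alice = safety_code(b"alice-public-key", b"bob-public-key")
bob = safety_code(b"bob-public-key", b"alice-public-key")
assert alice == bob  # both sides read out the same digits
```

Reading six 5-digit groups aloud over a call is far less error-prone than comparing shades of a color matrix.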
This is such an old topic. Every time something related to Telegram happens, somebody starts a discussion about how it's not e2e-by-default. But the reality is nobody cares. And considering this, it's ridiculous that Durov is now detained on accusations of being responsible for all kinds of information spread on a non-e2e-by-default messenger.
Fascinating. I might have missed it, but I don't think the author mentioned the possibility of steganography. Just code the encrypted text such that it resembles a normal conversation.
Steganography is pointless given that encrypted and metadata-protected communication is ubiquitously available to those who need it. Steganography is a niche you read about in your first year of studying the world of privacy and then completely forget, because nobody has time for spycraft when there's life to be lived. The novelty wears off faster than you can imagine.
You could use an image. But you could use text as well. E.g. you could agree on a code phrase to be said when some "dirty deed done dirt cheap" has been completed. Or you could encode a binary string by alternating British English spellings with American English spellings: e.g. "color" means 0, "colour" means 1; "gray" means 0, "grey" means 1, etc., and then just use those alternate spellings in a normal conversation.
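The spelling trick above can be mechanized in a few lines. A toy sketch (the word pairs are illustrative; no real tool works exactly like this):

```python
# Encode bits as American (0) vs British (1) spellings.
PAIRS = [("color", "colour"), ("gray", "grey"),
         ("center", "centre"), ("flavor", "flavour")]

def encode(bits: str) -> list:
    # Cycle through the word pairs, picking the spelling matching each bit.
    return [PAIRS[i % len(PAIRS)][int(b)] for i, b in enumerate(bits)]

def decode(words: list) -> str:
    lookup = {us: "0" for us, _ in PAIRS}
    lookup.update({uk: "1" for _, uk in PAIRS})
    return "".join(lookup[w] for w in words)

words = encode("1011")
assert words == ["colour", "gray", "centre", "flavour"]
assert decode(words) == "1011"
```

Four words convey four bits, so the channel is tiny, but it's invisible to anyone not specifically looking for it.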
Same thing with Proton Mail. I have never understood the "trust me bro, we encrypt it" business model. If it's not your key on your client machine, it's not encrypted.
Note that Proton Mail servers don't hold your private master key directly — it is always stored encrypted with your password. Also, Proton Mail allows you to import your keys: https://proton.me/support/pgp-key-management
No, it's definitely not. Moderation means I can run my group how I want, you can run your group how you want, and others can decide if they want to participate in either of our groups or start their own groups.
Censorship is when someone else dictates how we can run our respective groups.
I don't know how much you have used Telegram, but it's ridden with absolutely vile stuff.
You open the "Telegram nearby" feature anywhere and it's full of people selling drugs and scams. When I mistyped something in the search bar I ended up in some ISIS propaganda channel (which was straight up calling for violence/terrorism). All of this on unencrypted public groups/channels ofc (I'm pretty sure it's the same with CP, although I'm afraid to check for obvious reasons).
I think there is a line between "protecting free speech" and being complicit in crime. This line has been crossed by Telegram.
I use it a lot, and I run some large groups on it. I don't see any of that stuff, I've never gone looking for it, and I'm not even sure how to look for it. Can you tell me some examples of what to search for to see what you're talking about?
> Censorship is when a third party uses coercion to force admins to submit to them and remove posts against their will
What a weird hill to die on, given the whole context of this situation.
Do you see public recruitment of people into terrorist cells as a freedom of speech? Do you see publicly selling drugs as a freedom of speech? It isn't about censorship at all, it's about actual *illegal* activity.
Now it's up to Durov and his lawyers to prove that Telegram actually dealt with that. So far France doesn't seem convinced.
Terrorist recruitment and selling drugs are conduct, and whoever engages in that illegal conduct can, and should, be prosecuted.
The problem I have is with requiring the chat service to police that or making its operators liable for the illegal conduct of its users.
It shouldn't be up to Durov to prove he did or didn't do anything; it's up to France to prove that he or his company actively participated in such conduct. And no, people using the service to engage in illegal acts isn't nearly enough, any more than Google's CEO should be liable for a drug dealer using Maps to navigate to the drug deal location, or Venmo should be liable for the buyer paying the seller with it.
The reason it's worth defending this "hill" is because allowing governments to use censorship as a convenient means of solving these problems always leads to more control and restrictions that infringe on the legitimate rights of everyone.
I understand the appeal of these tactics. Since we know that terrorist groups operating abroad will use chat services to incite locals to commit violence, it's tempting to search the chat service and stop that from happening by censoring the communication, preventing the radicalization. Since we know that drug sellers organize the sale of the contraband using the chat app, it's tempting to search the chat app and censor that speech, thus preventing the buyer from learning where to meet the seller. Or wait for enough speech to cross the line into conduct and then arrest them for it. Sounds great. If it would work, I'd support it.
The problem is that it won't work, and the only way to "fix it" will be to push more and more and more surveillance and control. It's already being pushed. Look at this chat control nonsense. Do you support that?
So what I'm saying, is let's just recognize that it's a basic human right for people to communicate freely and that operators of communication services shouldn't be held liable for the actions of their users.
Yes but let's also be clear that some forms of speech censorship are widely and broadly supported in public, 'town square' or broadcast media situations. Things like child porn, personal threats, calling for or organizing violence, hate speech, etc. Laws and social acceptance of this kind of censorship, of course, differ in different regions.
Hacker News may 'moderate' illegal content on this website, but they don't have a choice in the matter; US or state authorities will shut them down if they do not, so it's technically censorship. Your view on whether this is good or bad will depend on many factors, one of which may be how you view the legal structure of your government, which is substantially different in France, the US, or Dubai (where Telegram is located).
As is mentioned in the article, Telegram is not simply a 'secure messaging app'. They are also serving a role similar to Facebook, Twitter, Instagram, or TikTok. They host publicly accessible channels and public group chats with thousands of members, which are all (apparently) unencrypted and accessible to the Telegram company. It may be reasonable (both legally and socially) to expect that a company which has knowledge of public, illegal speech take steps to remove that content from its platform.
And Durov, by choosing to be a media company and not E2E encrypt all of his user's private communications, has walked right into a situation where he needs to abide by local laws moderating/censoring illegal content, everywhere.
> Moderation is what happens here on HN: Admins have some policies to keep the conversation on track, users voluntarily submit to them.
What do you mean by users voluntarily submitting to these policies? This distinction seems key in your argument, but I don't see what alternatives to submitting I have here, making it involuntary, right?
If HN decided to ban all posts about Donald Trump that is moderation. Users voluntarily submit to this policy by participating in the site, and if they do not, they will be banned.
If the State of California required that all web sites run from their state are REQUIRED to ban all posts about Donald Trump, that is censorship.
Moderation is "your house, your rules" while censorship is someone else imposing their rules in your house.
Do you see what I'm saying? When France is talking about "moderation" of Telegram, what they actually mean is censorship.
It depends on whether the parties to the communication want that or not.
So let's say a few child molesters create a chat service and use it to send the worst, most horrible child pornography amongst themselves. Removing it is censorship, not moderation.
Look, I'm not trying to argue for legalization of child pornography here. That is illegal contraband, full stop. The intent of my comment is to say "let's just call it what it is".
I think the overwhelming consensus is that child pornography is so horrible that mere possession of it must be CENSORED.
I'm not arguing that censorship is always wrong. For instance, I don't want to see public billboards of graphic sex or violence. I think it's good that we censor that, so that we aren't forced to look at things like that when we don't want to.
What is bothering me is that proponents of censorship, and especially certain proponents of it who want to use it as a tool to suppress ideas they don't like, have recently started using the word "moderation" in order to sneak their plans into policy without raising objections. The reason is because when we hear the word "censorship" we immediately think, "Whoa, hold on there, censorship is very harsh, let's take a hard look and make sure this is serious enough that resorting to censorship is justified and appropriate", whereas when we hear the word "moderation" we think, "Of course, we all appreciate someone deleting the spam and trolls who annoy us", and we're less likely to think critically about exactly what kind of expression is being legally prohibited.
The author claims that everyone refers to Telegram as an encrypted messenger, but he only provides a single example to support that. I quickly checked Google News and couldn't find any media on the first page that did the same. It feels like manipulation.
UPDATE: anyone who downvotes, I invite you to check for yourselves.
Try the mud puddle test: log into your account on a new device using the password recovery flow. Can you see your old messages?
If the answer is yes then law enforcement can too.
https://www.forbes.com/sites/anthonykosner/2012/08/05/how-se...
Note that the mud puddle test was originally described on Matt's very blog: https://blog.cryptographyengineering.com/2012/04/05/icloud-w... :)
And it only works because a corporation likely would want to offer this to its users as a convenient feature. If they were actively trying to hide this, they can rig the test and keep access to themselves.
> If the answer is yes then law enforcement can too.
Is it technically possible for them to see it: yes
Does Telegram let them see it: I don't think so. That seems to be the core issue around Durov being arrested.
They probably should implement E2EE for everything. Then they will have a good excuse not to cooperate, because they simply don't have the data.
> Does Telegram let them see it: I don't think so.
This is exceptionally naive. Even if he was arrested for not sharing with the French, what about for other countries? Was he arrested for not ever sharing or not sharing enough? Even if he, personally, has never shared, that doesn’t say anything about his employees who have the same access to these systems.
Your data is not private with Telegram. You are trusting Telegram. It is a trust-based app, not a cryptographically secure app.
If you trust telegram, that’s your choice, but just because a person says the right words in interviews doesn’t mean your data is safe.
Telegram is the only messaging app that I know of which brought attention to the fact that your messages go through Google/Apple notification APIs, which seems like it would utterly defeat any privacy advantage offered by E2EE
> Does Telegram let them see it: I don't think so. That seems to be the core issue around Durov being arrested
The UAE requires decryption keys as part of their Telco regulations.
If Telegram can operate in the UAE without VPN (and it can), then at the very least the UAE MoI has access.
They (and their shadow firms like G42 and G42's shadow firms) were always a major buyer for offensive capabilities at GITEX.
On that note, NEVER bring your personal phone to DEFCON/Blackhat or GITEX.
Edit: cannot reply below so answering here
Cybersecurity conferences.
DEFCON/Blackhat happen during the same week, so you have a lot of script kiddies who lack common sense trying to pwn random workloads. They almost always get caught (and charged - happens every year), but it's a headache.
GITEX is MENA and Asia's largest cybersecurity conference. You have intelligence agencies from most of the Middle East, Africa, Europe, and Asia attending, plus a lot of corporate espionage because of politically connected MSSPs as well as massive defense tenders.
AFAIK this current case has absolutely nothing to do with any form of chat features; it's about Telegram's public channels, which more or less work like Reddit/Twitter/any other news channel, except Telegram refuses to censor their content.
> They probably should implement E2EE for everything
He explained in his blog why he doesn't like E2EE:
https://telegra.ph/Why-Isnt-Telegram-End-to-End-Encrypted-by...
Why Isn’t Telegram End-to-End Encrypted by Default?
Pavel Durov August 15, 2017
All the encryption stuff is largely a red herring. The issue isn't technical access to the information; it's that people can share and exchange information that the various regimes do not want shared. They want censorship, i.e., control of thought and speech, arresting the flow of information.
They know what is being said, and that's what they want to arrest: the fact that information can be sent and received. And by "they" I mean more than just the French. That part was just coincidental and pragmatic.
The French state does not operate that quickly on its own, to get an arrest warrant five minutes after he landed and execute on it immediately. That has other fingerprints all over it in my view.
> Does Telegram let them see it: I don't think so.
I do think so: https://archive.is/M5zw4
Also, 'exile' https://istories.media/en/news/2024/08/27/pavel-durov-has-vi...
> They probably should implement E2EE for everything
Certainly not, because then Telegram would lose a lot of the functionality that makes it great. One thing that I really enjoy about Telegram is that I can have it open and synced across many independent devices. Telegram also has e2e as an option on some clients, which can't be synced.
Either Telegram will let them see it, or Telegram's CEO will go to jail. Telegram's CEO doesn't want to go to jail, so Telegram will let them see it.
They probably share it with Russian authorities. Just look now: Russia is allowing protests in favour of him (they only allow protests they support), and they arrested a French citizen on fake drug charges right after.
Will they let _US_ law enforcement see it? No. Will they let Russian? Of course.
Do you have some info about Durov being arrested for not letting law enforcement see encrypted messages? The public info says he was arrested for "...lack of moderation, ...[and] failing to take steps to curb criminal uses of Telegram."
I don't see anywhere saying he's been arrested for anything to do with encryption or cooperating with investigations.
eg https://www.bbc.co.uk/news/articles/ckg2kz9kn93o but pretty much all the sources I have read say the same
Well of course, but this is a feature of Telegram. It's the only messaging app where messages are stored on the cloud. This of course has security implications, but it also allows you to have a big number of chats without wasting your device's memory like WhatsApp does, or having to delete old conversations, and allows you to access your chats from any device. By the way, you can also set a password to log in from another device (two-factor authentication; WhatsApp has this option now too).
To me it's a good tradeoff; of course I wouldn't use Telegram for anything illegal or suspect.
> It's the only messaging app where messages are stored on the cloud.
Besides Slack and Discord and Teams and whatever the heck Google has these days and iMessage and...
I think you mean it's the only messaging app that purports to have a focus on security where messages are stored in the cloud, which is true, but also sus. There's a reason why none of the others are doing it that way, and Telegram isn't really claiming to have solved a technical hurdle that the E2E apps didn't, it's just claiming that you can trust them more than you can trust the major messaging apps.
Maybe you can and maybe you can't, the point is that you can't know that they're actually a safer choice than any of the other cloud providers.
But that's literally the entire point of this article. That is, in this day and age, when people talk about "secure messaging apps" they are usually implying end-to-end encryption, which Telegram most certainly is not for the vast majority of usages.
I think a high definition photo taken on a recent phone takes up an awful lot more device memory than a "big number of chats"
This is such a misrepresentation. Telegram could, at will, feed the cloud-2FA password to a password hashing function like Argon2 to derive a client-side encryption key. Everything could be backed up to the cloud in an encrypted state only you can access. Do they do that? No.
So it's not so much a trade-off as it is half-assed security design.
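Concretely, what the parent describes could look like this sketch, which uses the standard library's scrypt as a stand-in for Argon2 (which isn't in the stdlib); parameters and names are illustrative:

```python
import hashlib, os

def derive_client_key(password: str, salt: bytes) -> bytes:
    # Memory-hard KDF: the server would store only the salt and the
    # resulting ciphertext, never the password or the derived key.
    # (scrypt stands in here for the Argon2 mentioned above.)
    return hashlib.scrypt(password.encode(), salt=salt,
                          n=2**14, r=8, p=1, dklen=32)

salt = os.urandom(16)  # could be stored server-side next to the backup
key = derive_client_key("cloud-2FA password", salt)

# Same password + salt gives the same key on any device; without the
# password, the server holds only ciphertext it cannot decrypt.
assert key == derive_client_key("cloud-2FA password", salt)
assert key != derive_client_key("wrong password", salt)
```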
> It's the only messaging app where messages are stored on the cloud
Unreal. Please share how you came to this world view.
> Well of course, but this is a feature of Telegram. It's the only messaging app where messages are stored on the cloud.
Wrong, Matrix does it too, but fully e2ee.
> and allows you to access your chats from any device.
No it doesn't, because it is possible with e2ee as well
> It's the only messaging app where messages are stored on the cloud.
Instagram. FB Messenger. Skype. LINE. KakaoTalk. Discord. Slack. Teams. iMessage.
You never know what may suddenly become illegal.
>It's the only messaging app where messages are stored on the cloud.
So do all the others, with the exception of something like IRC.
That's it. The article could be just that. You log back in and all your messages are there without you having to provide a secret or allow access to some specific backup? Your data just lives on the server. The only thing preventing anyone from accessing it is the goodwill of the people running the server.
Not true. Secret chats live only on the device where you started them. Regular people may not use them (their problem), but they are common for business-critical chats in my circles.
Indeed and this is the other thing - even if Telegram don't themselves co-operate with law enforcement, it'd be fairly easy for law enforcement to request access to the phone number from the carrier, then use it to sign into the Telegram account in question and access all of the messages.
You can set a password that’s required to authenticate a new device.
Once that's set, after the SMS code (and assuming you don't have access to an existing logged-in device, because then you are already in), you can either reset the password via an email confirmation _or_ create a new account under that phone number (with no existing history, contacts, etc).
If you set a password and no recovery email, there is no way for them to get access to your contacts or chat history barring getting them from Telegram themselves.
If you apply this test to things like LastPass or Bitwarden, they fail too. And yet they don't keep my unencrypted passwords on their servers.
If you lose your Bitwarden master password you've lost your data. It passes the mud puddle test.
I'm probably dumb, but why would that be proof?
I upload encrypted backups to a cloud service provider (AWS, Google Cloud). I go to another computer, download them, use a key/password to decrypt them.
Sure, I get it: you're typing something that decrypts the data into their app. That's true of all apps, including WhatsApp, etc. The only way this could really be secure is if you used a different app to do the encryption, one that you wrote/audited, such that the messaging app never has access to your password/private key. Otherwise, at some point, you're trusting their app to do what they claim.
> > using the password recovery flow
> use a key/password
The previous poster intentionally mentioned the password recovery flow. If you can gain access without your password, then law enforcement can too. If you could only gain access with your password, you could consider your data safe.
Offhand, this sounds like a terribly insecure workflow, but...
The client creates a public/private key pair used for E2EE.
The client uses the (raw) account password as part of the creation of a symmetric encryption key, and uses that to encrypt the SECRET key and store it on the service's cloud.
A new client signs in, downloads the encrypted SECRET-key blob, and decrypts it using the symmetric key reconstructed from the sign-in password. Old messages can then be decrypted.
-- The part that's insecure -- If the password ever changes, the SAME SECRET then needs to be stored to the cloud again, encrypted by the new key. Some padding with random data might help with this, but it still sounds like a huge security loophole.
-- Worse insecurity -- A customer's device could be shipped a compromised client which uploads the SECRET keys to requesting third parties upon sign-in. Those third parties could be large corporations or governments.
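The wrap/re-wrap flow described above can be sketched as follows. The XOR keystream is a toy stand-in for a real authenticated cipher such as AES-GCM, and every name here is illustrative:

```python
import hashlib, os

def kdf(password: str, salt: bytes) -> bytes:
    # Password -> symmetric wrapping key (memory-hard KDF).
    return hashlib.scrypt(password.encode(), salt=salt,
                          n=2**14, r=8, p=1, dklen=32)

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy keystream (SHA-256 in counter mode), for illustration only;
    # a real client would use an authenticated cipher like AES-GCM.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

secret_key = os.urandom(32)  # the long-term E2EE SECRET key

# Wrap under the current password; blob + salt go to the cloud.
salt = os.urandom(16)
blob = xor_stream(kdf("old password", salt), secret_key)

# A new client signs in and unwraps with the reconstructed key.
assert xor_stream(kdf("old password", salt), blob) == secret_key

# Password change: the SAME secret is re-wrapped under a new key,
# which is exactly the weak point flagged above.
new_salt = os.urandom(16)
new_blob = xor_stream(kdf("new password", new_salt), secret_key)
assert xor_stream(kdf("new password", new_salt), new_blob) == secret_key
```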
I do not see how anyone expects to use a mobile device for any serious security domain. At best average consumers can have a reasonable hope that it's safe from crooks who care about the average citizen.
> When you regain consciousness you'll be perfectly fine, but won't for the life of you be able to recall your device passwords or keys
You can't use your password as input to the mud puddle test.
I know this is getting off-topic, but all the discussion about encryption is missing an important weakness of any cryptosystem: the human factor.
I find it interesting that countries like Singapore haven't introduced requirements for backdoors. They are notorious for passing whatever laws they want, as the current government has a supermajority and courts that tend to side with the government.
On top of that, Telegram is widely used in illegal drug transactions in Singapore.
What’s the reason? They just attack the human factor.
They just get invites to Telegram groups, or they bust someone and force them to hand over access to their Telegram account. Set up surveillance for the delivery and boom, the crypto drug ring is taken down. They've done it again and again.
One could imagine this same technique could be used for any Telegram group or conversation.
Would love to see a side-by-side comparison of iMessage, Signal, WhatsApp and Telegram on this.
You already know how Signal is going to come out here, because this is something people complain incessantly about (the inconvenience of not getting transcripts when enrolling new devices).
Here it is: https://www.securemessagingapps.com/
Matrix doesn't allow this. You need a dedicated chat key in addition.
Also the same with Skype "encryption". The data is "encrypted", but you receive the private key from the server upon sign-on... so one just needs to change that password temporarily.
How to do that on initial account creation:
- locally create a recovery key and use it to wrap any other essential keys
- Split that or wrap that with two or more keys.
- N - 1 goes to the cloud to be used as MFA tokens on recovery.
- For the other, derive keys from normalized responses to recovery questions, use Shamir's secret sharing to pick a number of required correct responses and encrypt the Nth key.
You can recover an account without knowing your original password or having your original device.
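The split-and-threshold step in the scheme above can be sketched with a minimal Shamir's secret sharing over a prime field (toy code; a real system would use a vetted library):

```python
import random

PRIME = 2**127 - 1  # Mersenne prime, big enough for a 16-byte secret

def split(secret: int, n: int, k: int):
    # Random polynomial of degree k-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def combine(shares):
    # Lagrange interpolation at x = 0 recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

recovery_key = random.randrange(PRIME)  # the key that wraps everything else
shares = split(recovery_key, n=5, k=3)  # e.g. one share per recovery question
assert combine(shares[:3]) == recovery_key  # any 3 correct answers suffice
```

With k = 3 of n = 5, an attacker who finds only two recovery answers learns nothing about the key.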
IOW, you've made the recovery questions into alternate passwords, passwords that law enforcement is likely able to find or brute force.
Telegram has an answer to this: https://telegram.org/faq#q-do-you-process-data-requests - only Secret Chats are e2e encrypted.
As an alternative, Signal or Jami conversations are always e2e encrypted.
Unless you can prove (e.g. using your old device or a recovered signing key) that the new device is yours. In that case, if the service supports it, the new device could automatically ask your contacts to re-send the old messages using the new device's public key.
Telegram has secure calls and secure e2e private chats. All other chats are backed up in the cloud. So if you intend to use it for private communication, the answer is "no"; if you don't care, the answer is "yes".
Unfortunately if the answer is no, it does not mean law enforcement can’t
Why not the "founder locked up" test? If the founder claims secure encryption, yet they are not in jail, that means there's no secure encryption because they negotiated their freedom in exchange for secret backdoors.
Maybe, but it's not a good litmus test. If it's truly secure and the founder can't provide information because they don't have access to it, it's also possible authorities can't build a case in most countries.
1 reply →
That isn’t applicable here. Telegram isn’t encrypted and yet they refused to comply with subpoenas. Companies whose customer data is encrypted can truthfully say that they have no way to access it for law enforcement. Telegram can’t.
Maybe in the future, creators of encrypted messaging apps will get locked up. I certainly hope not. But this case doesn’t indicate anything one way or another.
8 replies →
Yeah, and the only way to get governments to learn why e2ee is important is to show them that if law enforcement can get at the data, so can hackers and phishers. We need as many politicians' dark secrets hacked and exposed as possible. Performing such hacks should be a whistleblower-protected right codified into law.
In my opinion, Telegram is more of a social network than a messenger. There are many useful channels and in many countries, it plays an important role in sharing information. If we look at it from this point of view, e2ee does not seem very important.
We should also not forget that, in the time when all social media (Reddit, X, Instagram etc.) close their APIs, Telegram is one of the only networks that still has a free API.
That's the dangerous part. It's a messaging app that took on the function of a social media platform. It did so without robust security features like end-to-end encryption, yet it advertised itself as heavily encrypted. As Green stated in his blog post, users expect that to mean only the recipient can read what you say, i.e. end-to-end encryption.
Telegram would be fine if it advertised itself as a public square of the internet, like Twitter does. Instead, it lures people into a false sense of security for DMs and small group chats, which is what Green's post, and thus this thread, is ultimately about.
Free API doesn't mean anything until they fix what's broken, i.e. provide meaningful security for cases where there's reasonable expectation of it.
> a social media platform. It did so without robust security features like end-to-end encryption
Most social media platforms don't support e2ee.
Some chat apps do support e2ee but also require a god damn phone number to log in (as does Telegram), which makes the "encryption" useless because authorities can just ask the telco to hand over the login SMS code.
1 reply →
> It did so without robust security features like end-to-end encryption yet it advertised itself as heavily encrypted.
Telegram has E2E encryption, but only in Secret Chats: https://telegram.org/faq#secret-chats
2 replies →
The free API is amazing. I have so many little helper bots that automate my life. It's easier and more feature-rich than Twilio or Slack. I made my own stock-management bot that eats a screener spreadsheet I upload in the chat and tells me if I should sell my stocks.
There is even that freqtrade bot that runs on telegram, even RSS bots. It really is amazing. So easy to use for chat ops.
I don't know what else you would use the API for.
Most "normal" people use messaging apps and social media DMs interchangeably.
For instance, two days ago my partner wanted to show me a message her friend sent, went to WhatsApp and couldn't find it, then realized the friend had used Instagram DMs. Most people don't care enough.
> It's a messaging app that took in the function of a social media platform. It did so without robust security features like end-to-end encryption yet it advertised itself as heavily encrypted.
Do you mean that social networks must implement E2E? Personally I think it is a good idea, but existing social networks and dating apps do not implement it, so Telegram is not obliged to either.
As for promises of security, everybody misleads users. Take Apple. They advertise that cloud backups are encrypted, but what they don't like to mention is that by default they store the encryption keys in the same cloud, and even if the user opts into "advanced" encryption, the contact list and calendar are still not E2E encrypted, under a silly excuse (see the table at [1]). If you care about privacy and security you probably should never use iCloud in the first place, because it is not fully E2E encrypted. Also note that Apple doesn't even mention E2E in the user interface and instead uses misleading terms like "standard encryption".
This is not fair. Apple doesn't do E2E cloud backups by default and nobody cares, phone companies do not encrypt anything, Cloudflare has disabled Encrypted Client Hello [2], but every time someone mentions Telegram, they are blamed for not having E2E chats by default. It looks like the bar is set differently for Telegram than for other companies.
[1] https://support.apple.com/en-us/102651
[2] https://developers.cloudflare.com/ssl/edge-certificates/ech/
1 reply →
What is your definition of a social network?
It’s not encrypted by default, and even if it were, you should never trust any connected device with anything important. That being said, Telegram is hands down the best communication platform right now. It is feature-rich, with features implemented years ago that are only now being added to other platforms. It has normal chatting, video calls, groups, channels, and (in theory) unlimited storage, all for free. I just hope it doesn’t go downhill after what happened these last few days, because there’s no proper replacement that fulfills all of Telegram’s features at once.
What's in Telegram that you don't see in Signal? Honest question, I only use Signal rather than Telegram.
Signal has probably the worst UX of any messaging app. It also used to require sharing phone numbers to add contacts, which imo is already a privacy violation.
Telegram is fast, responsive, gets frequent updates, has great group chat, tons of animated emojis, works flawlessly on all desktop and mobile platforms, has great support for media, bots, and a great API, allows edits and deleting messages for all users, and I really like the sync despite it not being e2e.
21 replies →
The worst UX you can provide: clumsy, slow view switching, search worse than WhatsApp's, stickers that look like they're from 2005, no formatting, no bot API (of course there are a few "hacked" implementations, but is that really the way?), and a UI bloated with margins and padding.
No smooth animations either; animations are what make Telegram stand out from everything else here, though maybe not everyone cares whether a 6-core phone can deliver more than 60fps in 2024...
That's what I remember, and yes, most of these are probably easy-to-fix UI/UX bugs, but even though the clients are open source, they don't get fixed.
Telegram is great for large groups. It's better to compare Telegram to Reddit than Signal.
Signal is excellent for tiny groups of known participants. I prefer it over anything else for this use case. The group permissions Signal introduced a few years ago are well suited for that purpose. I've recently started running small groups on Signal with about 100 participants who mostly know each other, but not tightly. The recent addition of phone number privacy makes this feasible.
Once you start moving up in scale you really need moderation tools, and Signal doesn't do so well there. When you have thousands of people and it's open to the public you need to moderate or else bad actors will cause your valuable contributors to leave. Basic permissions like having admins who can kick people out and restricting how new members can join only gets you so far.
The issue is that in Signal there is no group as far as the server is concerned: The state of the group exists only on client devices and is updated in a totally asynchronous manner. As a consequence it is more difficult for Signal to provide such features. For example, Signal currently has no means to temporarily mute users, to remove posts from all group members, easy bots to deal with spam, granting specific users special privileges like ability to pin messages, transferable group ownership as opposed to a flat "admin" privilege, etc.
Think about the consequences of Signal's async nature with no server state: What does it mean to kick someone out? An admin sends a group update message that tells other clients to stop including that user in future messages. Try this: Have a group member just delete Signal and then re-register. Send a message to the group. They're still in the group. You get an identity has changed message. These are really only actionable with people who you know... that is, in tiny groups.
And then, the biggest strengths of Signal, which are its end to end encryption and heroic attempts to avoid giving the server metadata, are less valuable in the context of a large public group: Anyone interested in surveilling the group can simply join it, so you have to assume you're being logged anyway. Signal lacks strong identities as a design choice, so in big groups it's harder to know who you're really talking to like you know that "Joe Example, founder of Foo Project" is @Foo1988 on Telegram and @FooOfficial on X and u/0xFooMan on Reddit.
This is one of those questions where it's hard to answer but it's obvious once you use it.
What's the difference between a Fiat and a Ferrari? What's the difference between CentOS and Linux Mint? What's the difference between a McDonald's burger and a Michelin-starred one?
I have friends and groups on both platforms. On Signal, I'm basically just sending messages (and only unimportant ones, like when we're meeting; sending media mostly sucks, so I generally only have very dry chats on Signal).
Whereas on Telegram, I'm having fun. In fact it's so versatile that my wife and I use it as a collaborative note-taking system, archiver, cvs, live shopping list, news app (I'm currently browsing Hacker News from Telegram), etc. We basically have our whole life organised via Telegram. I lose count of all the features I use effortlessly on a daily basis, and only notice them when I find myself on another app. This is despite the fact that both Signal and WhatsApp have since tried to copy some of these features, because they do so badly. A simple example that comes to mind: editing messages. It took years for WhatsApp to let you edit a message (I still remember the old asterisk etiquette for issuing a correction to a previous message). Now you can, but the UX is horrible; I think you long-press, then there's a button next to copy which opens a menu where you find a pencil which means edit, or something like that. In Telegram I don't even remember how you do it, because it's so intuitive that I don't have to.
Perhaps that's why I find the whole "Telegram encryption" discussion baffling, to be honest. For me, it's just one of Telegram's many extra features. You don't have to use it, but it's there if you want it. I don't feel like Telegram has ever misled its users into thinking its raison d'être is to be a secret platform only useful if you're a terrorist (as the UK government seems to want to portray it recently).
I get the point about "encryption by default", but it doesn't come for free: there are usability sacrifices, and not everyone cares for it. Insisting that not having encryption by default mars the whole app sounds to me like saying that not having a particular set of emojis as the default mars the whole app. It feels disingenuous somehow.
7 replies →
> What's in Telegram that you don't see in Signal?
The first feature that comes to mind for me is being able to use multiple devices. Signal only allows using it with one phone. If you add a second device, the first one stops working. You can use a computer and a phone, but not multiple phones. Telegram supports this without any issues. I still struggle to understand this limitation.
2 replies →
User base, large groups (I think the max is 200k members), channels, bots to automate work, animated stickers, video messages (not the calls one), and video/voice calls within the group (not sure if Signal has that), file storage and file sharing, multiple devices without worrying about losing messages -and you might mention the security part and that’s ok, I want the accessibility, if I want security I will look somewhere else- among other features. Those are on top of my head.
Cross-device message history for me. I can go back to my very first message sent. Signal to this day sucks for message history.
People.
Signal doesn't provide a web app, unlike Telegram.
For me, that I can just do apt install telegram-desktop
Good desktop client.
Polls
As far as I see there was no criticism targeted at anything else than the encryption part.
The worst thing is that almost every non-techie who uses Telegram thinks Telegram in general is e2ee.
Anecdotal evidence, so take this with a grain of salt - I work with a bunch of people from Ukraine and almost all of them exclusively use Telegram to keep up with the news and family back home. From talking to them for a while, it's mostly because it's free, has excellent support for sync across multiple devices (including audio, video and other media), has support for proxies to circumvent any kind of blocking, public channels for news updates.
Honestly it would be better if Telegram dropped the facade of having E2EE. It's generally very low on the priority list of most people anyway, as much as it would hurt anyone reading this, but that's the truth. People are not using it for secure messaging, but for a better UX and reliability.
EDIT: Telegram does require a phone number to sign up.
Ideally they should really use something like jami. https://jami.net/
> doesn't require any personal identifier
Do they still not require ID when you buy a SIM card in Ukraine?
10 replies →
Not a single person I know who uses Telegram cares about or thinks of it as e2ee. Whether "techie" or "non-techie" (whatever the definition of that is). People use it because it has a nice interface, was one of the first to have good "sticker" message support (yes, a lot of people care about that kind of stuff), and of course because of the good old network effect.
It's only on HN I ever see people set up Telegram as some supposed uber-secure private app for Tor users and then demolish that strawman gleefully.
Do you read other news sites that mention Telegram or is this an N=1 situation?
Today, on the same topic, another tech site which generally gets a lot of things right (but whoever is responsible for writing about Telegram, or maybe their internal KB, is consistently wrong and doesn't care about feedback) wrote that it is an encrypted chats service: https://tweakers.net/nieuws/225750/ceo-en-oprichter-telegram... ("versleutelde-chatdienst" means that for those fact checking at home)
1 reply →
You could also ask about whether they think it's private. And if they say yes, ask them what it means. Does it mean only sender and intended recipients can read the message, or is it fine if the service has someone check the content. Would they agree on the notion "it's OK my nudes I send to my SO are up for grabs for anyone who hacks Telegram's servers", or do they think should Telegram plug this gaping hole.
Also, people tend to state they have nothing to hide when they feel they have nothing to fight with. But I can't count the number of times I've seen a stranger next to me on a bus cover their chat the second I sit down: me, a completely random person with no interest in their life, a threat to them.
7 replies →
For the past few weeks I've been using Telegram to create my own cool stickers, and when talking with people on WhatsApp (eughh) I find myself struggling to put into words what my Telegram stickers would have conveyed.
Telegram is mostly used by people in the US for drug deals and chatting with people in Eastern Europe, so it's very common to believe it's a secure messenger.
Amplified by journalists, and most frustratingly to me even some techies that just can't be bothered to properly examine all available facts despite their technical capabilities to examine them.
100% this. Most people do not realize that all those non-secret messages from private chats and group chats are stored in a database that people at Telegram have access to.
I’d guess (not gonna test it, but it feels reasonable) that “almost every non-techie” has a very vague idea of what e2ee even is, so it’s not clear where the “worst part” comes from. Pretty sure the best ideas they have about security come from hacker movies, best case, on average.
Because Telegram is E2EE, but only in Secret Chats: https://telegram.org/faq#secret-chats
not everybody understands that "encrypted" =/= "end-to-end encrypted".
the perceived secure nature of telegram has even been memorialized in mainstream rap, courtesy of kendrick lamar in 2017 (https://genius.com/11665524).
BS. The vast majority of non-tech users do not, for the simple reason that they can't know it even if they cared, and they don't. Even tech users can't be bothered to read links to the FAQ on Telegram's site.
There is so much misinformation around Telegram that it alone made me trust it more (if a known liar tries to discredit something, that increases the chances of it being good; I'm talking about the comments here on HN).
I know nothing about cryptography, but the following does not sound too bad as a default, to be honest. And I think it is misleading to focus solely on e2ee and not mention the distributed aspect.
https://telegram.org/faq#q-do-you-process-data-requests
> To protect the data that is not covered by end-to-end encryption, Telegram uses a distributed infrastructure. Cloud chat data is stored in multiple data centers around the globe that are controlled by different legal entities spread across different jurisdictions. The relevant decryption keys are split into parts and are never kept in the same place as the data they protect. As a result, several court orders from different jurisdictions are required to force us to give up any data.
> Thanks to this structure, we can ensure that no single government or block of like-minded countries can intrude on people's privacy and freedom of expression.
> Telegram can be forced to give up data only if an issue is grave and universal enough to pass the scrutiny of several different legal systems around the world.
> To this day, we have disclosed 0 bytes of user data to third parties, including governments.
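For intuition only, the key splitting claimed above can be modeled with the simplest possible scheme, an n-of-n XOR split in which every part is required. This is a sketch of the claim, not Telegram's actual mechanism; their backend is closed source, so the claim is unverifiable.

```python
# Toy model of "keys split into parts, never kept with the data": an
# n-of-n XOR split, where every jurisdiction's part is needed. This
# sketches the *claimed* design only, not anything Telegram is known
# to actually run.
import secrets

def split_key(key: bytes, n: int) -> list[bytes]:
    """Split `key` into n parts; the XOR of all n parts recovers it."""
    parts = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    acc = key
    for p in parts:
        acc = bytes(a ^ b for a, b in zip(acc, p))
    return parts + [acc]  # n - 1 random parts plus one masked part

def join_key(parts: list[bytes]) -> bytes:
    """XOR all parts back together to recover the original key."""
    acc = bytes(len(parts[0]))
    for p in parts:
        acc = bytes(a ^ b for a, b in zip(acc, p))
    return acc
```

The catch, raised repeatedly in this thread, is that the same operator controls all n parts; compel that one operator (or its CEO) and the multi-jurisdiction math is irrelevant.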
You can coherently argue that encryption doesn't matter, but you can't reasonably argue that Telegram is a serious encrypted messaging app (it's not an encrypted messaging app at all for group chats), which is the point of the article. The general attitude among practitioners in the field is: if you have to reason about how the operator will handle legal threats, you shouldn't bother reasoning about the messenger at all.
[flagged]
10 replies →
> if you have to reason about how the operator will handle legal threats, you shouldn't bother reasoning about the messenger at all.
That's true.
You need to run your own platform people. XMPP is plenty simple, plenty powerful, and plenty safe -- and even your metadata is in your control.
Just self host. There's no excuse in 2024.
Wake up people!
Why should the arrest of someone else affect YOU?
14 replies →
Yes: End-to-end encryption is technically quite difficult, but politically and legally feasible (at least currently, at least in most countries).
Simply not cooperating with law enforcement is technically moderately difficult, but politically and legally impossible.
Between a difficult and an impossible option, the rational decision is to pick the difficult one.
Indeed. Even being charitable and assuming that they're not lying (they say elsewhere that they've shared zero bytes with law enforcement, despite this being demonstrably false), in reality if, say, they were to arrest the founder in an EU country (France, perhaps), all they need to do is threaten him with twenty years in prison, and I'm sure he'll gladly give up the keys from all the different locations they supposedly use.
Is there a nice solution for multiparty (n >= 3) end-to-end encryption?
16 replies →
I wonder if this is practically relevant at all.
Given that users can access their messages without any interaction with people at Telegram, automatic aggregation of the cloud data for a single endpoint must be in place.
In consequence, the data can be accessed from a single jurisdiction anyway.
Wouldn’t being forced to give up the password and logging in be a violation of the 5th amendment, at least in the US? I think it’s a mixed bag of rulings right now, but it seems like it would make sense for it to fall that way at the end of the day.
1 reply →
The problem with this approach is that it relies on governments accepting your legal arguments. You can say "no, these are separate legal entities and each one requires a court order from a different country" all you want, but you also need to get the courts themselves to agree to that fact.
Problem with this claim is that it's hardly verifiable. Telegram's backend is closed source, and the only thing you can be sure of is that their backend sees every message in plaintext.
[flagged]
16 replies →
Maybe hijack the key and message before they get distributed. Or go after the pieces themselves if they sit with Chinese or Russian authorities. Or threaten to close the local data center unless they collect the pieces from elsewhere, and see if they can be convinced to hand over what they have, regardless of where they put it.
We may know nothing about cryptography, but handing over both the secret and the key to that secret to the very same party is quite a trusting step, even when they say 'I promise I will not peek or let others peek, pinky promise!', with an 'except if we have to or if we change our mind' in the fine print or between the lines.
https://www.spiegel.de/netzwelt/apps/telegram-gibt-nutzerdat...
> Translated: Contrary to what has been publicly stated so far, the operators of the messenger app Telegram have released user data to the Federal Criminal Police Office (BKA) in several cases.
https://torrentfreak.com/telegram-discloses-user-details-of-...
> Telegram has complied with an order from the High Court in Delhi by sharing user details of copyright-infringing users with rightsholders.
Anyway, just some examples in which their structure doesn't matter. In the end, user data is still given away. It's also why e2ee should be the sole focus. Everything else is "trust me bro, it's safe" levels of security.
>To protect the data that is not covered by end-to-end encryption, Telegram uses a distributed infrastructure. Cloud chat data is stored in multiple data centers around the globe that are controlled by different legal entities spread across different jurisdictions.
This is utter bullshit I debunked back in 2021.
https://security.stackexchange.com/questions/238562/how-does...
In practice it also didn't work: only one government was needed to arrest the guy. And now all they need is a hammer or some pliers. No need for multiple governments to coordinate.
1 reply →
> The relevant decryption keys are split into parts and are never kept in the same place as the data they protect. As a result, several court orders from different jurisdictions are required to force us to give up any data.
Or the CEO and owner, staring down the barrel of a very long time in prison, obtains the keys from his employees and provides them to the authorities.
Would he do this? To me, it matters little how much I trust someone and believe in their mental fortitude. I could instead rely on mathematical proofs to keep secrets, which have proven to be far better at it than corporations.
I am wondering if there was any incident that disproved the “we have disclosed 0 bytes of user data to third parties, including governments.” statement.
Yes, https://www.androidpolice.com/telegram-germany-user-data-sur...
Splitting stuff between multiple companies doesn't really protect anyone if the boss of all companies is held hostage.
Also
> To this day, we have disclosed 0 bytes of user data to third parties, including governments.
Didn't they conclude an agreement with Russian gvt in 2021?
Clearly the investigating authorities are not buying that argument because, well, it's completely absurd. Both technically and legally, Telegram are in control of those keys, regardless of where they are hosted.
> Telegram can be forced to give up data
That's all you need to know. Matrix and Signal can't be forced in any way.
The admins of Matrix instances sure can be forced to give up data. The metadata is not encrypted, and many rooms are not either.
5 replies →
That’s Telegram's CEO saying how he and his employees were “persuaded and pressured” by US FBI agents to integrate certain open-source libraries into Telegram (1). There are a lot of questions to ask, like whether those open-source libraries are indeed compromised, among other things. I take it that this arrest was the final straw to pressure him into giving up and handing over some “needed” data, as all the accusations I've read are laughable. Instagram is full of human trafficking, minor exploitation, drug dealers, and worse. The same goes for other social media, and I don’t see Elon or Zuck getting arrested. I am confident this arrest is meant to obtain specific information, and after that he will be released, or spend 20 years in prison if he doesn’t comply.
(1) https://youtu.be/1Ut6RouSs0w?t=1082
Or he's trained in the art of lying
"At St. Petersburg State University, Mr. Durov studied linguistics. In lieu of military service, he trained in propaganda, studying Sun Tzu, Genghis Khan and Napoleon, and he learned to make posters aimed at influencing foreign soldiers."
https://www.nytimes.com/2014/12/03/technology/once-celebrate...
You really think the FBI would casually go to Durov and start telling him which libraries to deploy in his software.
This "They're trying to influence me that means its working" 5D-chess is the most stupid way to assess security of anything.
There's nothing to backdoor because it's already backdoored:
Code does not lie about what it does, and the Telegram clients' code doesn't hide that it does not end-to-end encrypt the data it sends to Telegram's servers. That's the backdoor. It's there, right in front of you, with a big flashing neon sign that says backdoor. It's so obvious I can't even write a paper about it, because no journal or conference would accept me stating the fucking obvious.
I do wonder if this would hold up, though. If Telegram stored each character of your chat in a different country, couldn't a single country still force them to hand over the data, and either fine them or force them to stop operating there if they wouldn't share the full chat? It seems like a loophole, but I don't know what the precedent is.
Telegram offers end-to-end encryption in the same way that McDonalds offers salads.
Via a touchscreen? :P
yes. in that if you want it it's there, but nobody's forcing it on you if you just want a burger.
Oh, I must have missed this. Please tell me how to enable secret chats for groups. And for my desktop chats. Also, I'd like to turn on the setting that defaults to secret chats whenever I open a new one. Oh? I can't? Sounds like it's not there if I want it, after all. Good thing they didn't force it on me, though /s
13 replies →
Technically but not practically.
Expired from the day before, but with a fresh date sticker on it?
Overly chilled?
I love the comparison, stealing it.
In opposition to something French?
I don't know why people get hung up on Telegram's encryption. Maybe they're trying to make it be something it isn't.
Is Discord end to end encrypted, is IRC? Nope, does it make them useless? Again no.
Same with Telegram, it's a chat tool where you can select your audience and have a good UX with native bot support. (like Discord and IRC).
That's what I want, nothing more.
If I want to plan a coup, I'd use something else of course.
It's because Telegram is marketing itself as a secure messaging app, and because journalists continuously present it as such while discussing the arrest of its CEO.
I've only heard telegram presented as a messenger for criminals in western media.
3 replies →
because on their front page in giant font they call themselves private and secure and outright say “heavily encrypted”.
it’s their own fault. a better question might be:
why do they keep crying, over and over, when people call them out for endangering their users? it’s super odd.
Exactly this. It is all about how they market themselves. If they had promoted themselves as a social media-ish platform, nobody would be causing a fuss about their encryption.
Neither Discord nor any of the popular IRC clients (HexChat, WeeChat, mIRC) even mentions the words security or privacy to promote their products.
Moreover, as Matthew Green mentioned in his blog post, there are many instances where Telegram (or Pavel Durov) has gone out of its way to attack the encryption offered by Signal and WhatsApp. If he were pitting his messenger against Discord, why would he be worried about Signal or WhatsApp?
Thanks for the blog post; now I finally have a good resource I can point people to the next time they claim Telegram is secure.
> I am not specifically calling out Telegram for this, since the same problem [with metadata] exists with virtually every other social media network and private messenger.
Notably, Signal offers a feature called Sealed Sender[0]. While it doesn't solve the metadata problem entirely, it does at least reduce it a bit.
[0]: https://signal.org/blog/sealed-sender/
Sealed sender doesn't really solve the metadata problem at all:
* https://www.ndss-symposium.org/wp-content/uploads/ndss2021_1...
Generally you need something like TOR to hide who is talking to who.
Interesting, I feared Sealed Sender might be susceptible to statistical analysis (hence my phrasing "reduce it a bit") but it's worse than I expected ("Signal could link sealed sender users in as few as 5 message"). Thanks for the link!
As for TOR, that wouldn't really help much, would it, given that the described attack is at the application level of Signal. Or are you talking about not using Signal altogether?
2 replies →
Related presentation from Network and Distributed System Security Symposium:
https://www.youtube.com/watch?v=HoN6FLC5Hss
1 reply →
With Matrix, you can use your own (or trusted) server. Doesn't it solve the problem with the metadata? At least when two trusted servers interact.
This is part of what I love about Mastodon: if you PM someone, very often you're talking between two random servers and odds are good that the admin is a friend of a friend. No dragnet statistical analysis stuff, just friends running some software that normal people can also use. Distributed systems at their best
1 reply →
If telegrams encryption is so bad why is Pavel Durov under arrest?
The arrest warrant cites that he was not cooperating with authorities trying to crack down on various illegal drug activities on Telegram. None of the other social networks have had their CEOs arrested. Is it simply that Telegram is the only one without backdoors for Five Eyes?
It seems to me the secret chat feature actually works too well?
> If Telegram's encryption is so bad, why is Pavel Durov under arrest?
He's under arrest precisely because it is bad enough that Telegram is in a position to share data with law enforcement, but it chooses not to.
Or maybe he is sharing with the other guys.
1 reply →
I'd suggest waiting for more details from French officials, they have already said that they'll address it tomorrow. So far claims from the media sound like Durov's being prosecuted due to very little moderation on the platform, not because of E2EE.
Even so, most messages sent on Telegram are plaintext, they're encrypted only in transport layer, but Telegram's servers see them in full. Secret chats (the only E2EE chats on Telegram) are hidden away from the users, hence the original link.
> So far claims from the media sound like Durov's being prosecuted due to very little moderation on the platform, not because of E2EE.
But that's why it's good. With all the mainstream media censoring stuff, Telegram was a (good for the people) exception.
On the other hand, that's probably why they arrested him.
1 reply →
> Even so, most messages sent on Telegram are plaintext, they're encrypted only in transport layer, but Telegram's servers see them in full.
you contradict yourself in the same sentence
1 reply →
Read this: https://fortune.com/crypto/2024/06/27/telegram-dark-net-blac...
Telegram channels are public, unencrypted web shops for all kinds of illegal goods. I guess the French government alleges that Durov is not doing enough to stop these activities on his platform.
It doesn't necessarily have anything to do with encryption.
It indirectly has a lot to do with encryption, in that if Telegram were actually encrypted, they'd probably have no grounds for holding him in the first place.
(At least at the moment, in most countries) it's not illegal to refuse to ship a backdoor in your end-to-end-encrypted software upon government request, but in most of them it is illegal to withhold data that you hold in a form accessible to you when you receive a warrant for it.
4 replies →
The difference between Telegram and others is that in Telegram you can type "<city> drugs" into the global search and instantly find groups with drug dealers and buyers near you. I don't think his arrest has anything to do with the level of encryption at all.
Personally I find Telegram kind of refreshing in today's internet landscape where everything is so sanitized. You can discover all kinds of niches you never knew existed.
> Is it simply that Telegram is the only one without backdoors for Five Eyes?
Do you honestly think that any backdoor would be used for such mundane crimes? Even more so, it being in any way acknowledged that there might be a backdoor?
On that topic, it's highly likely Telegram is cooperating with Russian LE. Services and people that don't cooperate get thrown out of Russia quickly.
> The arrest cites that he was not cooperating with authorities to crack down on drug trafficking and other illegal activities on Telegram. [...] None of the other social networks have their CEOs arrested.
Because if you want to operate in any country, you're either cooperating with the authorities or you'll get shut down or arrested. Hiding evidence you have is not tolerated anywhere.
https://www.zdnet.com/article/russia-unbans-telegram/
and eventually even became a major propaganda tool for the Russian army.
2 replies →
I can give you some insight into why EU law enforcement and politicians dislike telegram. It’s not because they can’t snoop on you, it’s because Telegram fails to comply with moderation requests for channels where illegal content is shared.
We had a nice scandal of sorts here in Denmark where a bunch of young men shared pictures of young women without consent. If you're old enough to remember those old "rate this girl" web pages from the '90s, you'll know what the pictures were used for. Basically it was a huge database of hot girls in Denmark and where they went to school. Today around 1000 young men have that on their permanent record, as Facebook worked with law enforcement to catch the criminals. Telegram doesn't do that. This was even a little more innocent than it may sound, considering the men were at least aged similarly to the women they were sharing pictures of. Disgusting and illegal, but Telegram houses far worse and refuses to deal with it.
I know a lot of tech minded people are up in arms over this, but it's really mainly about not wanting an unmoderated social network. Not because big brother is angry, but because people use it to organise bullying, share revenge porn, sell drugs and far, far worse. There are also political factions within the EU who want to kill encryption (though they were severely weakened when the Brits left), but the anger against social media platforms is much more "European". In that we (and I say this as the EU culture in general, not as in 100% of us) tend to view the people who enable bad behaviour as participating in that behaviour. Platforms like Facebook, Twitter, Instagram and YouTube have been sort of protected by being early movers with mass adoption. Being American companies probably helps as well, considering EU / US relations. Telegram never had such advantages, and is further disadvantaged by how it's almost exclusively used for crime in Western Europe.
Obviously banning the platform won’t help. There will just be another platform. But then, we’ve also been losing a drug war for 50+ years even though we can’t even keep drugs out of our prisons.
Haha Facebook worked with law enforcement to catch criminals. Who works with law enforcement to catch Facebook?
> you’ll know what the pictures were used for.
Fapping on? And what's the problem with that, exactly?
1 reply →
The problem is that it never ends at protecting Danish women or kids, or "fighting terrorism".
Do you think he doesn't cooperate with Russian authorities?
I am pretty sure he does not, given all I know about him, his brother and the way Telegram is being developed.
>If Telegram's encryption is so bad, why is Pavel Durov under arrest?
Because it was so bad he had access to all that content, and because he had access to it, he should have moderated it, and because he didn't he's now arrested.
>Is it simply that Telegram is the only one without backdoors for Five Eyes?
Telegram doesn't have a backdoor. Its open source client can be used to verify it leaks every group message, and every desktop message you ever send, to the service provider without ever applying secret-chat grade encryption
>It seems to me the secret chat feature actually works too well?
Well, Signal can be used to verify its end-to-end encryption is actually used everywhere, but nobody's calling for arresting Moxie or Meredith. So maybe playing 5D-chess over the news isn't working, unless you're here just to amplify this ridiculously fallacious line of thinking.
The arrest was about the expected removal of illegal and harmful content in groups that the masses see, so no encryption involved. Did you not read the news - AND the blog - in full?....
Telegram is the comms system for the Russian military.
https://www.politico.eu/article/telegram-ceo-arrest-pavel-du...
“They practically detained the head of communication of the Russian army,”
As hilarious as it sounds, it's at least partly true.
3 replies →
Please could whoever downvoted this explain why? There's plenty of evidence of this. Access to Telegram would be like cracking Enigma
1 reply →
I am amazed at the low quality comments here. Encryption really doesn't matter as much as the trust of the app here. Any malicious app author can encrypt everything on the wire 100% securely and yet leak 100% of your data to some state actor. Anything you type into the chat box is only encrypted by the app after you type it, and is probably stored in the clear in some local SQLite db. That gives them a whole bunch of options to mess with the plain text data. Even if the app source code is published, you don't know if they backdoored it before they submitted it to the App Store.
>Encryption really doesn't matter as much as the trust of the app here. Any malicious app author can encrypt everything on the wire 100% securely and yet leak 100% of your data to some state actor.
This is exactly the problem with Telegram. Telegram defaults to client-server encryption for everything, and you can't enable end-to-end encryption for anything on desktop, or for group chats ever. Only 1:1 chats and calls on mobile have end-to-end encryption. Client-server encryption is exactly that "100% secure encryption on the wire". When that data arrives at the server, it's no longer encrypted, and Telegram can do whatever it wants with it, including leaking it to some state actor (like the FSB/SVR).
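To make that concrete, here's a toy sketch (NOT real cryptography, just a keyed keystream for illustration; all keys and the message are made up) of the point that the only difference between client-server and end-to-end encryption is who holds the key:

```python
# Toy illustration (NOT real crypto): client-server vs end-to-end
# encryption differ purely in *who holds the key*, not in how random
# the wire traffic looks.
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data with a SHA-256-derived keystream (toy cipher, symmetric)."""
    ks = b"".join(hashlib.sha256(key + bytes([i])).digest()
                  for i in range(len(data) // 32 + 1))
    return bytes(a ^ b for a, b in zip(data, ks))

msg = b"meet at noon"

# "Cloud chat": the client encrypts to a key the *server* also has.
server_key = b"shared-with-telegram"
wire = keystream_xor(server_key, msg)
assert keystream_xor(server_key, wire) == msg   # server reads it fine

# Secret chat: key negotiated only between the two endpoints; the same
# server, holding only server_key, recovers garbage.
endpoint_key = b"known-only-to-alice-and-bob"
wire2 = keystream_xor(endpoint_key, msg)
assert keystream_xor(server_key, wire2) != msg
assert keystream_xor(endpoint_key, wire2) == msg
```

In both cases the bytes on the wire look equally random; the cloud chat just lets the server decrypt at will.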
>Anything you type into the chat box is only encrypted by the app after you type it, and is probably stored in the clear in some local SQLite db.
If endpoint security is of concern, your options with networked TCBs are quite limited. Are you sure the malware doesn't have a chance to escalate its privileges and read messages in clear from RAM?
>It gives them a whole bunch of options to mess with that plain text data.
I'm looking forward to hearing about how you managed to fix this. Should we implement memory as eFuses (https://en.wikipedia.org/wiki/EFuse) to prevent editing logs? What if the user wants to delete his messages?
>Even if the app source code is published, you don't know if they backdoored it before they submitted it to the App Store.
E.g. with Signal android, you can pull the APK off the device, and compare its hash against the client that was reproducibly built from the source code you have in your possession. Been there done that https://imgur.com/a/wXYVuWG
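The comparison step itself is just a file-hash check; a minimal sketch (file names here are hypothetical, and pulling the APK from the device is done separately with `adb pull`):

```python
import hashlib

def sha256_file(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths: one APK pulled from the device, one produced by
# the project's reproducible-build process.
# match = sha256_file("Signal-device.apk") == sha256_file("Signal-rebuilt.apk")
```

If the digests match, the binary on your phone is byte-for-byte the one built from the published source.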
>I am amazed at the low quality comments here.
Too bad you're not exactly improving them with your nonsense.
The malicious app need not be the messaging app either. It could be your keyboard.
> malicious app author can encrypt everything on the wire 100% securely and yet leak 100% of your data
Um, surely you understand the difference between piping random-looking bytes uselessly to whoever and having a readable copy of all data readily available to whoever hacks the system or applies for a sysadmin role? Or are you making the assumption that people use a closed-source client and the server can push malicious code?
> Even if the app source code is published as you don’t know if they backdoored it before they submitted to App Store.
Doesn't work if you have third parties also working with the system or forking the code to work with it. It gets noticed. Your concept of "e2ee can be 100% leaked anyway" only works if you don't know what code you're running. You need to trust the community in general to uncover issues you've overlooked (in the code or build process) but that's not the same as not having encryption at all. You can't audit the servers but you can audit the client code.
> You need to trust the community in general to uncover issues
My point is that this community could just be your friendly CIA operatives running the show with a veneer of open source. Also this “community” has no liability unlike the closed platform companies.
Telegram basically has "trust me bro" security.
Even worse than Apple. They at least have some e2ee options.
Am I the only one who uses Telegram mainly for p2p e2ee audio calls? It's great for that.
I use it for friends, family and partner, videocalls and normal chat.
Sure, it may not be on the same level as Signal when it comes to security but it simply is leagues above others in terms of usability, stability and bells&whistles. It's like comparing a Ford Zephyr with a Volvo EX30.
I agree, but I wouldn’t compare Signal to a Zephyr. Classic cars have that charm and magic. I would say Signal is more like a Honda Civic; its users are loud and annoying, and yet it’s mediocre in all categories. :)
Only the secret chat is e2e encrypted. All the other chat options are not. I think calls are also not encrypted since they appear in the normal chat history not in the e2e chat.
Obviously if your phone is compromised your e2ee chat is not safe.
> Obviously if your phone is compromised your e2ee chat is not safe.
Pretty much, a lot of people think that seeing E2EE means everything is safe, which I believe gives a false sense of security. You can have your phone compromised (especially when I know your phone number, Signal I’m looking at you) or be subject to other means of attacks, exposing everything. I would rather know that this app is not secure so I don’t share anything important, while keeping secure communication to other means.
Not only that. If they want to intercept e2e chats, it's possible with a MITM attack, which is not a difficult thing to do if you control the server. Of course, if the users check the keys they'll see they are different, but practically no one does that.
And I think WhatsApp probably does it, otherwise why have the authorities never complained that WhatsApp did not let them see the conversations?
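A toy sketch of that MITM (tiny finite-field Diffie-Hellman for illustration only; real clients use elliptic curves and large groups, and all numbers here are made up). It shows why comparing key fingerprints is the one thing that would expose the attack:

```python
# Toy DH MITM: a malicious server substitutes its own public key in
# both directions. Both victims get a working "shared secret" -- just
# with Mallory, not with each other. Only a fingerprint check reveals it.
import hashlib

P, G = 0xFFFFFFFFFFFFFFC5, 5  # toy prime/generator (far too small for real use)

def keypair(secret):
    return secret, pow(G, secret, P)   # (secret, public = G^secret mod P)

def shared(my_secret, their_public):
    return pow(their_public, my_secret, P)

def fingerprint(pub):
    return hashlib.sha256(str(pub).encode()).hexdigest()[:8]

a, A = keypair(123456789)   # Alice
b, B = keypair(987654321)   # Bob
m, M = keypair(555555555)   # Mallory, the server in the middle

# The server forwards M instead of the real public keys:
alice_key = shared(a, M)    # Alice unknowingly keys with Mallory
bob_key   = shared(b, M)    # ...and so does Bob
assert shared(m, A) == alice_key and shared(m, B) == bob_key

# The only user-visible clue: the fingerprint Alice sees (of M) differs
# from the one Bob's real key (B) would have produced.
print(fingerprint(M) != fingerprint(B))  # True
```

Since both chats "work" (messages encrypt and decrypt normally), nothing looks wrong unless the two users compare fingerprints out of band.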
2 replies →
Stealing someone's phone number wouldn't give you any Signal data, as all the messages have perfect forward secrecy, though, right? And all contacts would see an alert that your security number had changed. Not completely foolproof, and I would like Signal to use something other than phone numbers for accounts, but it's pretty good.
8 replies →
>You can have your phone compromised (especially when I know your phone number, Signal I’m looking at you) or be subject to other means of attacks, exposing everything.
Knowing someone's phone number doesn't automatically let you compromise their device. This is such a ridiculous argument.
>I would rather know that this app is not secure so I don’t share anything important, while keeping secure communication to other means.
This is nirvana fallacy. It's essentially saying "We should not talk about Telegram lying about its security, when in reality nothing is 100% secure". Yeah, nothing is, there's always an attack. That doesn't contribute anything of interest to the topic, it just tries to kill the criticism. And I'm saying this as someone who has worked on this exact topic for ten years: https://github.com/maqp/tfc
2 replies →
Calls seem to be e2e encrypted: https://core.telegram.org/api/end-to-end/video-calls
No idea how secure the encryption is, but calling someone on Telegram is safer than sending texts.
Depends on who your adversary is and how much you trust their protocol (some weird homegrown thing with clever/questionable cryptographic choices, the last time I checked) and implementation. Your texts don't generally run through Telegram's infrastructure, for example.
Too bad I can't send a secure text from my Telegram desktop client. Lucky for me, there's Signal.
Only 1-1 calls are encrypted, voice chats (group calls) are not
> Obviously if your phone is compromised your e2ee chat is not safe.
Yes, and that's where the 'practical' argument pops up. With all the E2EE buzz, is it really helping in the scenarios where it's supposed to work the best?
This thread gives an overview on why Signal and other apps are not really practical: https://x.com/Pinboard/status/1474096410383421452
> The broader problem of ephemeral or spur of the moment protest activity leaving a permanent data trail that can be forensically analyzed and target individuals many years after the fact is unsolved and poses a serious risk to dissent. But E2E is not the solution to it.
> I feel like Moxie and a lot of end-to-end encryption purists fall into the same intellectual tarpit as the cryptocurrency people, which is that it should be possible to design technical systems that require zero trust, and that the benefits of these designs are self-evident
Does Telegram support E2E on anything other than Android and iOS? Last time I checked it was not available for desktop.
> One of the biggest privacy problems in messaging is the availability of loads of meta-data — essentially data about who uses the service, who they talk to, and when they do that talking. […] the same problem exists with virtually every other social media network and private messenger.
Is this true for Signal too? I thought it wasn’t.
Avoiding any metadata leaks without generating tons of cover traffic (to frustrate timing correlation attacks) is very hard.
Signal does indeed use an architecture (at least for chats with contacts, or optionally for everyone when you enable the "sealed sender" option that makes you a bit more prone to receiving spam) where Signal doesn't know who's sending a given message from a given IP address, only which account it's destined for.
But any entity in position to globally correlate traffic flows into and out of Signal's servers can just make correlations like "whenever Alice, as identified by her phone's IP, sends traffic to Signal, Bob seems to be getting a push notification from Apple or Google, and then his phone connects to Signal, so I think they're talking".
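A hedged sketch of what that correlation looks like in practice (all events below are invented, and a real attacker would need far more observations to beat the noise):

```python
# Timing-correlation sketch: a global observer sees only
# (timestamp, user) events -- uplink traffic to Signal's servers, and
# phones waking up to fetch right afterwards. Repeated co-occurrence
# inside a small window links sender to receiver without reading a byte.
from collections import Counter

WINDOW = 2.0  # seconds of tolerance for network jitter

uplinks   = [(10.0, "alice"), (55.3, "alice"), (70.1, "carol"), (90.2, "alice")]
downlinks = [(10.4, "bob"), (10.5, "dave"), (55.9, "bob"),
             (71.0, "erin"), (90.8, "bob")]

pairs = Counter(
    (sender, receiver)
    for t_up, sender in uplinks
    for t_down, receiver in downlinks
    if 0 <= t_down - t_up <= WINDOW
)

# Over repeated observations the true pair dominates the coincidences:
print(pairs.most_common(1))  # [(('alice', 'bob'), 3)]
```

With only a handful of events there are false positives (dave gets one coincidental hit here), but the count for the real pair grows linearly with every message exchanged.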
How accurate does the timing need to be? I imagine there must be many Bobs getting notifications around the same time. Also, if I use Signal behind a VPN is it still known that I’m talking to the Signal servers?
> But any entity in position to globally correlate traffic
Also, Signal relies on AWS, which could also perform such an attack it seems.
I would recommend reading these resources:
The Internet Is Broken: https://secushare.org/broken-internet
The Hitchhiker’s Guide to Online Anonymity: https://anonymousplanet.org/guide.html
Pointers to more resources: https://discuss.grapheneos.org/d/15005-books-or-sources-on-p...
> Is this true for Signal too? I thought it wasn’t.
It is, because you cannot use Signal without giving them your mobile phone number, and from that point onward they (and anyone they might be sharing data with) know the who/what/when, and more. My gut feeling, notwithstanding any apologist and their weak arguments, is that the design choice is exactly about the who/what/when because it's mandatory despite being entirely unnecessary from a technical perspective.
How does it follow that Signal knowing a phone number means they know who the identity that phone number represents is communicating with?
2 replies →
Of course not. The genius of Durov was in discovering that users don't really need e2ee and all the drawbacks that come with it, and that promising them that the app has really strong encryption is good enough even without actual encryption.
>Does Telegram have encryption or doesn’t it?
>Many systems use encryption in some way or another. However, when we talk about encryption in the context of modern private messaging services, the word typically has a very specific meaning: it refers to the use of default end-to-end encryption to protect users’ message content. When used in an industry-standard way, this feature ensures that every message will be encrypted using encryption keys that are only known to the communicating parties, and not to the service provider.
>From your perspective as a user, an “encrypted messenger” ensures that each time you start a conversation, your messages will only be readable by the folks you intend to speak with. If the operator of a messaging service tries to view the content of your messages, all they’ll see is useless encrypted junk. That same guarantee holds for anyone who might hack into the provider’s servers, and also, for better or for worse, to law enforcement agencies that serve providers with a subpoena.
>Telegram clearly fails to meet this stronger definition for a simple reason: it does not end-to-end encrypt conversations by default. If you want to use end-to-end encryption in Telegram, you must manually activate an optional end-to-end encryption feature called “Secret Chats” for every single private conversation you want to have. The feature is explicitly not turned on for the vast majority of conversations, and is only available for one-on-one conversations, and never for group chats with more than two people in them.
The worst is that Telegram Secret Chats are limited in functionality compared to the normal ones, for no reason. Sticker sets don't work, for example, and that's one of the main features of Telegram chats.
For me Telegram is more like an uncensored Twitter slash blog platform. I use it to check out public channels for updates and that's about it. For private communication, I use Whatsapp. So, lack of e2e by default is not an issue for me at all.
Telegram is not Signal, it is a waaay better Discord
I guess you meant to say Discord is a worse version of Telegram, which was created earlier. Though obviously many group features got into Telegram around the same time Discord gained traction.
Still not indexable, referencable, or freely readable
It's a walled-garden system, which is fine for private chats between groups of friends, but Discord is increasingly being used as a place to report bugs and share information. Telegram furthermore requires signing up with a phone number, which Discord did not (now you often need to for participating, when an admin of a community aka guild aka the misnomer "server" has turned on that requirement).
https://xkcd.com/979/ This comic will not be understood by gamers growing up today... (Except in many cases someone posted a solution or nudged DenverCoder9 in the right direction at least; with Discord, Slack, or Telegram you'd simply never find the thread in a search engine to begin with.)
> Discord is increasingly being used as a place to report bugs and share information.
So is Telegram. I'm in numerous groups with developers of Linux distros and other apps. Many developers use Telegram channels to post updates about their work.
Are there any pointers to work trying to make metadata private (i.e. encrypted)?
I was recently very curious about this question and asked similar ones here:
https://barac.at/essays/on-leaving-meta
As mentioned in a comment to one of your posts, the GNUnet people have probably gone the furthest in the quest to obfuscate metadata. Unfortunately, to this day no usable messenger application has come out of this, partially because GNUnet has largely been a research project.
As for applications in use today that address the metadata problem, have a look at Signal's Sealed Sender feature: https://signal.org/blog/sealed-sender/
As for recommending Telegram for secure messages, I side with the sibling comments ("Don't").
Since you seem to focus on decentralized protocols, I should add: In practice, while we all like federated and p2p apps for the freedoms & this warm fuzzy feeling they provide us with, by default they tend to have a much greater attack surface when it comes to metadata. This is because, compared to a centralized approach, metadata is openly available to far more parties. As a result, 3-letter agencies often won't even need a warrant to get their hands on the metadata: They can simply run traffic analysis and/or participate in the network themselves.
> I was just recommending Telegram as alternative to WhatsApp
If you care about privacy and security, please don't. Defaults matter, and private chats are effectively unusable for anyone using more than one device or needing group chats. And that's not even considering their strange home-baked cryptography.
Why didn't you recommend signal?
I am recommending both. The problem with Signal (which I use along with the other messaging apps) is that it is not as feature-rich as the other two, and Signal is not popular, so people download it just to interact with one person (me), whereas Telegram has a bigger user base.
Signal really needs a good bot support... that's the only thing keeping me on telegram.
Signal lost all credibility with their cryptobro bullshit
10 replies →
I know a bit about this topic.
For metadata you first want to remove the obvious identifiers, phone numbers, names. You'd want to use something like anonymous@jabbim.pl for your IM account.
Next, you'd want to eliminate the IP-addresses from server, so you'd want to connect exclusively through Tor. So you'd set the IM client proxy settings to SOCKS5 localhost:9150 and run Tor client to force your client to connect that way. This is error-prone and stupid but let's roll with it for a second.
Now jabbim.pl won't be able to know who you are, but unless you registered your XMPP account without Tor Browser, you're SoL, they already know your IP.
A better strategy is to use a Tor Onion Service based XMPP server, say 4sci35xrhp2d45gbm3qpta7ogfedonuw2mucmc36jxemucd7fmgzj3ad.onion (not a real one), and you'd register to it via IM client. Now you can't connect to the domain without Tor, so misconfiguring can't really hurt.
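For the curious, the SOCKS5 CONNECT request the IM client hands to the local Tor proxy looks like this (per RFC 1928; the onion address below is made up). The important detail is address type 0x03, which makes the proxy, i.e. Tor, resolve the name, so no DNS query for the .onion hostname ever leaves your machine:

```python
# Minimal sketch of an RFC 1928 SOCKS5 CONNECT request. ATYP=0x03
# (domain name) delegates name resolution to the proxy, which is what
# keeps .onion lookups inside Tor instead of leaking to a DNS resolver.
import struct

def socks5_connect_request(host: str, port: int) -> bytes:
    name = host.encode("ascii")
    assert len(name) < 256, "SOCKS5 domain names are length-prefixed with one byte"
    return (
        b"\x05\x01\x00"            # VER=5, CMD=CONNECT, RSV=0
        + b"\x03"                  # ATYP=3: domain name (resolved by the proxy)
        + bytes([len(name)]) + name
        + struct.pack(">H", port)  # destination port, big-endian
    )

req = socks5_connect_request("example4sci35xrhp2d.onion", 5222)
print(req[:4])  # b'\x05\x01\x00\x03'
```

This is only the request framing; a real client first performs the method-negotiation handshake and then sends this over the socket to localhost:9150.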
So that covers name and IP. We'll assume the content was already end-to-end encrypted so that leaks no data.
Next, we want to hide the social graph, and that requires getting rid of the server. After all, a server requires you to always route your messages through it and the service can see this account talks to this account, then to these ten accounts, and ten minutes later, those ten accounts talk to ten accounts. That sounds like a command structure.
So for that you want to get rid of the server entirely, which means going peer-to-peer. Stuff like Tox isn't Tor-only so you shouldn't use them.
For Tor-only p2p messaging, there are a few options:
https://cwtch.im/ by Sarah Jamie Lewis (great, really usable, beautiful)
https://briarproject.org/ (almost as great, lots of interesting features like forums and blogs inside Tor)
https://onionshare.org/ by Micah Lee. Also has chats between user and hoster
https://github.com/maqp/tfc by yours truly, crude UX but the security is unparalleled.
>On a side note, I was just recommending Telegram as alternative to WhatsApp
Don't. Telegram and WhatsApp both leak meatadata, but WhatsApp is always end-to-end encrypted. Telegram is practically never end-to-end encrypted. I'd use WhatsApp over Telegram any day. But given that unlike WhatsApp, Signal is open source so you know the encryption works as advertised, it's the best everyday platform. The metadata free ones I listed above are for people in more precarious situations, but I'm sure a whistleblower is mostly safe when contacting journalists over Signal. Dissidents and activists might find Cwtch the best option however.
It is weirdly fascinating that this question has to be answered on a semi-regular basis. I am not sure if it is more of an insight into humans, ephemeral nature of software or concern that something major has changed.
Or it's just nerds who are stupid and don't understand what matters in real world security for most people.
The fact that you can create a huge group and channels without sharing your phone and contacts is what made Telegram big.
You couldn't do that on WhatsApp until a few months ago, and it had been on Telegram for years. Why did Hong Kong protesters use Telegram and not WhatsApp? Read this: https://x.com/Pinboard/status/1474096410383421452
The fact that Telegram is massively used in both Ukraine and Russia shows that its model cannot be ignored.
I think it’s helpful because, as the author says, Telegram put effort into making you think it’s secure and Signal isn’t. As someone who's not close to this, it’s handy to have regular reminders.
It's an unfortunate reminder in that propaganda sometimes works very well.
>One of the biggest privacy problems in messaging is the availability of loads of meta-data — essentially data about who uses the service, who they talk to, and when they do that talking.
>I am not specifically calling out Telegram for this, since the same problem exists with virtually every other social media network and private messenger.
In fact, https://simplex.chat/ is the messenger with the least amount of metadata.
This snake oil is spreading like [Herpes] Simplex .
Again, the company lies about queues (a programming technique) being a privacy feature.
The application can not get rid of the metadata of server knowing which IPs are conversing, unless the clients explicitly connect to the service via Tor. The server must always know from which connection to which connection it routes packets. It's not a network hub, it's a switch, after all.
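To illustrate with a sketch (accounts and timestamps invented): even a relay that never sees a byte of plaintext can reconstruct the social graph from its routing log alone.

```python
# A relay server's routing log contains no message content at all, yet
# it is enough to rebuild who-talks-to-whom -- including the fan-out
# "command structure" shape described above.
from collections import defaultdict

routing_log = [  # (timestamp, from_conn, to_conn)
    (100, "cmdr", "cell1"), (101, "cmdr", "cell2"), (102, "cmdr", "cell3"),
    (700, "cell1", "m1"), (701, "cell1", "m2"),
    (705, "cell2", "m3"), (706, "cell2", "m4"),
]

graph = defaultdict(set)
for _, src, dst in routing_log:
    graph[src].add(dst)

# One account messages several, which each message several more shortly
# after -- the hierarchy is visible without decrypting anything.
print(sorted(graph["cmdr"]))  # ['cell1', 'cell2', 'cell3']
```

This is why removing the central relay (or forcing all hops through Tor) matters more for metadata than any payload encryption does.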
https://cwtch.im/ and https://briarproject.org/ route everything through Tor always, and they don't have server in the middle, which means there is no centralized authority to collect metadata. It's light years ahead of what Simplex pretends to offer.
One of the biggest, more significant as well as successful Internet-scale cons of the last decades that I can think of, apparently perfectly executed too.
This article discusses a point about Telegram that is well known, but only to techies. The vast majority of users are misled by journalists, many of whom have degrees in social "science", political "science", etc. That's not to say you need encryption; that's for each person to decide, perhaps for each conversation. But it needs to be an educated choice.
Though it's old hat, it's better to recycle this often so more people know.
Reads like a hit piece on Telegram from a crypto expert who couldn't be bothered to explain in more than one paragraph why the app he is calling not an encrypted app (according to what he personally thinks everyone means when talking about encryption) actually uses some encryption technology that he's not exactly sure of but suspects is insecure.
He specifically explains what people think an encrypted app is:
>Many systems use encryption in some way or another. However, when we talk about encryption in the context of modern private messaging services, the word typically has a very specific meaning: it refers to the use of default end-to-end encryption to protect users’ message content. When used in an industry-standard way, this feature ensures that every message will be encrypted using encryption keys that are only known to the communicating parties, and not to the service provider. From your perspective as a user, an “encrypted messenger” ensures that each time you start a conversation, your messages will only be readable by the folks you intend to speak with.
So an encrypted messaging app means to people the security that an end-to-end encrypted app provides.
He then explains how Telegram is not end-to-end encrypted.
* No end-to-end encryption by default
* No end-to-end encryption for groups, not even small groups.
To add, there's no end-to-end encryption for desktop chats either. And no end-to-end encrypted cross-platform chats either.
Your post reads like dollar-store damage control team post that didn't even read the article they're trying to discredit.
Double that. The entire article reads to me as handpicked and manipulative.
TLDR: 99.95% of messages on Telegram are stored as plain text on their servers and only encrypted between client and Telegram server. End-to-end encryption only works for 1-on-1 chats, is not available on half of their clients, and has terrible UX.
All this is just wrong. I wonder why HN likes throwing up wrong information about Telegram as fact. Is taking up 5 mins to proof these claims that hard?
> 99.95% of messages on Telegram are stored as plain text on their servers and only encrypted between client and Telegram server.
Wrong, and OP doesn't even mention plain text. The non-E2EE client-server data is stored encrypted, spread across various servers in different countries. https://telegram.org/privacy#3-3-1-cloud-chats
> End-to-end encryption only works for 1-on-1 chats, is not available on half of their clients, and has terrible UX.
Wrong again. I actually recently checked this for myself: their official clients on Android and Linux desktop have support for MTProto 2.0. Feel free to check whether other OSes support this feature. The only clients I know of where this is not enabled are the web clients.
2 replies →
Something that might be interesting on this topic: a forked version [0] of the Telegram client made during the protests in Belarus in 2020 (it appears to be actively maintained to this day). Can't vouch for it, but found it interesting.
[0] https://github.com/wrwrabbit/Partisan-Telegram-Android
That GitHub account is… interesting.
I thought this was going to be just a big "NO.", like the "are we X yet?" pages.
The article is still complying with Betteridge's law of headlines, though :)
It probably didn't want to get detained in France.
not being a criminal is really good, I don't have to worry about any of this stuff
Does anyone have any reason to believe that Telegram's E2EE doesn't have a backdoor? Because if not, then I fail to see why it matters whether the E2EE even exists in the first place.
Telegram clients are open source. Anyone can verify that the client does the end-to-end encryption correctly.
Telegram has had its own history of really weird issues with its encryption protocol, like the IGE, 2^64 complexity pre-computation attacks, IND-CCA vulnerability and whatever the hell this was https://words.filippo.io/dispatches/telegram-ecdh/
But these are not the big issues here. The issues Green's blog post highlighted were
* Telegram doesn't default to end-to-end encryption.
* It makes enabling end-to-end encryption unnecessarily hard
* It has no end-to-end encryption for groups
Those matter a gazillion times more than, e.g., a slightly older primitive would.
End-to-end encryption matters because Telegram is not just a social media or Twitter wall. It's used for purposes that deserve privacy, and Telegram isn't providing it.
Pavel did mention that investigation agencies tried to lure Telegram developers to use certain open source libraries.
It's no wonder why WhatsApp and other apps don't face much heat from the government, they're already with the government.
A reason to believe it is that all their apps are open source and have reproducible builds:
https://core.telegram.org/reproducible-builds
Their custom encryption is questionable, but since it's open source, someone would have found obvious backdoors by now.
you don't use telegram for encryption
you use it because you can use disposable phone number
nobody ever cares about encryption, it's a false flag
people care about no footprints
that's exactly why it was used to create civil unrest in Iran
https://www.wsj.com/articles/iranians-turn-to-telegram-app-a...
A simple question determines whether it's encrypted:
does the cloud server store the message and the key?
If the answer is yes, it's NOT fully encrypted!
Sounds contrary, right?
If the key and the message are both on the server, any LEO org can get them. For it to be fully encrypted, the cloud server should never store the keys.
So how many services claiming encryption have this flaw? All of them.
Why do you think Telegram has shell companies to avoid government subpoenas?
Because it knows its encryption is faulty against real-world LEOs and laws: it stores the keys in the cloud, which means it can be subpoenaed for those keys and messages.
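To make the distinction concrete, here's a deliberately toy sketch (a one-time pad, not anything resembling Telegram's or Signal's actual protocols): when the key is generated and kept client-side, the server only ever stores ciphertext it cannot read, so a subpoena against the server yields nothing useful.

```python
# Toy sketch (NOT real cryptography) of "the server never holds the key".
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR two equal-length byte strings (a one-time pad for this demo)."""
    return bytes(d ^ k for d, k in zip(data, key))

# Client side: the key is generated locally and never leaves the device.
message = b"meet at noon"
client_key = secrets.token_bytes(len(message))
ciphertext = xor_bytes(message, client_key)

# Server side: stores only ciphertext; it has no key to decrypt with.
server_storage = {"msg_001": ciphertext}

# Only a client holding the key can recover the plaintext.
assert xor_bytes(server_storage["msg_001"], client_key) == message
```

If, instead, the server also stored `client_key` next to the ciphertext (as any service offering password-free cloud history effectively must), the "encryption" would protect against nobody who can compel the server.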
>So how many services claiming encryption have this flaw? All....
Telegram is actually one of the only apps I've seen to defend their super-duper secure storage of keys online. All lies of course.
The overwhelming majority of secure messaging apps have no way to recover user data if you drop your phone in the ocean. This includes Signal, Wire, Threema, Session, Element, iMessage etc.
at the end of the day, if you run it on an iPhone, it's iOS that renders the text, and Apple is routinely subpoenaed
that gives a better explanation of why Telegram is safer in real-world settings than WhatsApp or other popular messengers: https://x.com/Pinboard/status/1474096410383421452
Seems to hang on some loading screen overlay. If it fits in a toot, care to just copy it here and save people a click?
I remember having this same conversation on here nearly a decade ago. I stopped using Telegram then.
This is actually a great blog post, since too many people tend to believe that Telegram is somehow more secure and private than the alternatives on the market.
Also, it's not as if Telegram doesn't have censorship. Over the last 3-4 years there were many cases where Durov blocked bots and channels belonging to protests and opposition in Russia, marked them as "fake", or just plain removed them without a trace.
So it's just another case where some rich guy tries to sell his own platform as a "freedom of speech" one, even though it's censored to his liking.
It's not E2E encrypted, so what? It's something the majority of users do not need, and it doesn't increase security that much given its downsides.
Of course it's much more convenient for Telegram not to have end-to-end encryption. They store everything on their servers, which means years of chat history probably weighing gigabytes per user, contrary to what WhatsApp/Signal do. And of course, if 10 million people send each other the same meme, it's stupid to keep 10 million copies of the same image on their servers just because it's end-to-end encrypted. They probably have a store that indexes each media file by its hash and avoids multiple copies, and that is fine. This is the reason Telegram can offer to store all your messages, including media files of up to 1 GB each, in the cloud for free.
As a user I prefer Telegram just because it's the only app that works perfectly synchronized across multiple devices (Android, Linux, macOS) with good-quality native clients, without wasting space on my phone for data.
By the way, end-to-end encryption isn't as safe as they claim. Sure, the conversation cannot be intercepted; however:
- you can put a backdoor on the endpoints, i.e. compromise the user's phone (something they do)
- you can make a MITM attack on the server (I don't know if they do that, but it's technically possible)
- you can access data backed up to other platforms (e.g. WhatsApp by default backs up to Google Drive or Apple iCloud, through which you can access all the conversations in clear text).
> By the way, end2end encryption it's not that safe as they claim. Sure, the conversation can not be intercepted, however: [...]
> - you can make a MITM attack on the server (don't know if they do that, but technically possible)
No it's not technically possible, by its very definition. The fundamental principle behind E2EE is that the server can be malicious or compromised all you want, but this does not impact message confidentiality or integrity.
>It's not E2E encrypted, so what? It's something the majority of users do not need, and it doesn't increase security that much given its downsides.
Privacy is a human right. Everyone needs it. And Telegram advertises itself as an encrypted messenger. For every non-expert, that means end-to-end encryption: only me and the recipient can read the message. Users expect Telegram to be more secure than WhatsApp. Telegram claims it's more secure than WhatsApp, and Telegram has attacked WhatsApp over its security. WhatsApp is always end-to-end encrypted; Telegram is not. So don't go putting words into people's mouths.
>Given that they store everything on their servers, it means years of chat history probably weighing gigabytes per user
It could be stored there with client-side encryption; Telegram doesn't need to have access to that data. Also, who says chats that are ephemeral in nature need to be forever accessible? I save what I need from Signal or Telegram.
>This is the reason Telegram can offer to store all your messages, including media files of up to 1 GB each, in the cloud for free.
It's not free. It comes at the price of your human right to privacy. You should get a job at Facebook with this marketing pitch.
>As a user I prefer Telegram just because it's the only app that works perfectly synchronized across multiple devices
It doesn't sync secret chats across multiple devices at all, not even to desktop. Signal does.
>good quality native clients
Your script is seven years old https://signal.org/blog/standalone-signal-desktop/
>You can put a backdoor on the endpoints, i.e. compromise the user's phone (something they do)
Nirvana fallacy. Why is Telegram offering secret chats if all endpoints are compromised? If they're not always compromised, then it should offer end-to-end encryption for everything, always. Like Signal, WhatsApp, Wire, Threema, iMessage, Cwtch, Briar, Element, Session...
>you can make a MITM attack on the server
Which is why every messaging app worth its salt offers safety numbers https://support.signal.org/hc/en-us/articles/360007060632-Wh...
Even Telegram has them, although their initial implementation, babby's first QR code, was a joke. How are you supposed to compare the shades of a color matrix over the phone?
https://encrypted-tbn0.gstatic.com/images?q=tbn:ANd9GcSUnBRB...
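For anyone unfamiliar, the general shape of safety numbers is roughly this (a simplified sketch; Signal's real scheme uses iterated SHA-512 over identity keys with a specific 60-digit encoding, not this toy hash): both parties hash the two public identity keys in a canonical order and compare the resulting number out of band.

```python
# Toy sketch of the idea behind safety numbers / key fingerprints.
import hashlib

def fingerprint(key_a: bytes, key_b: bytes, digits: int = 12) -> str:
    """Derive a short comparison number from both parties' public keys."""
    material = b"".join(sorted([key_a, key_b]))  # canonical, order-independent
    digest = hashlib.sha256(material).digest()
    return str(int.from_bytes(digest[:8], "big"))[:digits]

# Hypothetical identity keys for illustration.
alice_pub = b"alice-identity-key"
bob_pub = b"bob-identity-key"
mitm_pub = b"mallory-identity-key"

# Both devices compute the same number regardless of argument order...
assert fingerprint(alice_pub, bob_pub) == fingerprint(bob_pub, alice_pub)

# ...while a MITM substituting its own key changes the number, which the
# users would notice when comparing in person or over a trusted channel.
assert fingerprint(alice_pub, mitm_pub) != fingerprint(alice_pub, bob_pub)
```

The point is that the server never participates in the comparison, so a malicious server swapping keys gets caught by any pair of users who bother to verify.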
>you can access the data that is backed up on other platform
Oh, that would be horrible. Good thing Telegram doesn't have its data backed up in cloud, no wait, sorry, it does. ~Everything you ever do with the app is permanently stored in an ecosystem built by the Mark Zuckerberg of Russia, and his PhD in geometry bro Nikolai.
Shill harder.
This is such an old topic. Every time something related to Telegram happens, somebody starts a discussion about how it's not E2E by default. But the reality is that nobody cares. And considering this, it's ridiculous that Durov is now detained on accusations of being responsible for all kinds of information being spread on a messenger that isn't even E2E by default.
He's not in fact detained because information is being spread, he's detained for actively refusing to cooperate with law enforcement.
> Is Telegram really an encrypted messaging app?
If it is encrypted, then it aids terrorists and can be banned. So it is encrypted, whatever the technological details. It's a political decision.
Fascinating. I might have missed it, but I don't think the author mentioned the possibility of steganography. Just code the encrypted text such that it resembles a normal conversation.
Steganography is pointless given that encrypted and metadata-protected communication is ubiquitously available to those who need it. Steganography is a niche you read about in your first year of studying the world of privacy and then completely forget, because nobody has time for spycraft when there's life to be lived. The novelty wears off faster than you can imagine.
Would you use an image for this? Is there a clever way to do this with text?
You could use an image, but you could use text as well. E.g., you could agree on a code phrase to be said when some "dirty deed done dirt cheap" has been completed. Or you could encode a binary string by alternating British and American English spellings: e.g. "color" means 0, "colour" means 1; "gray" means 0, "grey" means 1, and so on, and then just use those alternate spellings in a normal conversation.
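That spelling trick is simple enough to sketch in a few lines (a toy illustration; a real covert channel would have to weave the code words into natural-sounding prose):

```python
# Each (American, British) pair carries one bit: 0 = American spelling,
# 1 = British spelling. The word pairs here are just example choices.
PAIRS = [("color", "colour"), ("gray", "grey"), ("flavor", "flavour")]

def encode(bits: str) -> list[str]:
    """Pick a spelling for each bit, cycling through the word pairs."""
    return [PAIRS[i % len(PAIRS)][int(b)] for i, b in enumerate(bits)]

def decode(words: list[str]) -> str:
    """Recover the bit string from the spellings that were chosen."""
    out = []
    for i, word in enumerate(words):
        american, _british = PAIRS[i % len(PAIRS)]
        out.append("0" if word == american else "1")
    return "".join(out)

words = encode("101")
assert words == ["colour", "gray", "flavour"]
assert decode(words) == "101"
```

At one bit per code word, the channel is tiny, which is part of why the parent comment calls steganography impractical compared to just using encryption.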
No, it is not.
Same thing with Proton Mail. I have never understood the "trust me bro, we encrypt it" business model. If it's not your key on your client machine, it's not encrypted.
Note that Proton Mail servers don't hold your private master key directly — it is always stored encrypted with your password. Also, Proton Mail allows you to import your keys: https://proton.me/support/pgp-key-management
They say
Well yes, but actually no.
Prime example of Betteridge's law of headlines.
Let's stop repeating this word "moderate" when what we're talking about is censorship.
Moderation is what happens here on HN: Admins have some policies to keep the conversation on track, users voluntarily submit to them.
Censorship is when a third party uses coercion to force admins to submit to them and remove posts against their will.
Durov has been arrested for refusing to implement censorship, not for anything concerning moderation.
The only difference between "moderation" and "censorship" is whether you like the policy or not.
No, it's definitely not. Moderation means I can run my group how I want, you can run your group how you want, and others can decide if they want to participate in either of our groups or start their own groups.
Censorship is when someone else dictates how we can run our respective groups.
I don't know how much you have used Telegram, but it's riddled with absolutely vile stuff.
You open the "Telegram nearby" feature anywhere and it's full of people selling drugs and scams. When I mistyped something in the search bar I ended up in some ISIS propaganda channel (which was straight up calling for violence/terrorism). All of this on unencrypted public groups/channels ofc (I'm pretty sure it's the same with CP, although I'm afraid to check for obvious reasons).
I think there is a line between "protecting free speech" and being complicit in crime. This line has been crossed by Telegram.
I use it a lot, and I run some large groups on it. I don't see any of that stuff, I've never gone looking for it, and I'm not even sure how to look for it. Can you tell me some examples of what to search for to see what you're talking about?
it's not specific to anything but humans, who are riddled with vile stuff.
just turn off any discovery and suggestion features
> Censorship is when a third party uses coercion to force admins to submit to them and remove posts against their will
What a weird hill to die on, given the whole context of this situation.
Do you see public recruitment of people into terrorist cells as a freedom of speech? Do you see publicly selling drugs as a freedom of speech? It isn't about censorship at all, it's about actual *illegal* activity.
Now it's up to Durov and his lawyers to prove that Telegram actually dealt with that. So far France doesn't seem convinced.
Terrorist recruitment and selling drugs are conduct, and whoever engages in that illegal conduct can, and should, be prosecuted.
The problem I have is with requiring the chat service to police that, or making its operators liable for the illegal conduct of its users.
It shouldn't be up to Durov to prove he did or didn't do anything; it's up to France to prove that he or his company actively participated in such conduct. And no, people using the service to engage in illegal acts isn't nearly enough, any more than Google's CEO should be liable for a drug dealer using Maps to navigate to the drug-deal location, or Venmo should be liable for the buyer paying the seller with it.
The reason it's worth defending this "hill" is because allowing governments to use censorship as a convenient means of solving these problems always leads to more control and restrictions that infringe on the legitimate rights of everyone.
I understand the appeal of these tactics. Since we know that terrorist groups operating abroad will use chat services to incite locals to commit violence, it's tempting to search the chat service and stop that from happening by censoring the communication, preventing the radicalization. Since we know that drug sellers organize the sale of the contraband using the chat app, it's tempting to search the chat app and censor that speech, thus preventing the buyer from learning where to meet the seller. Or wait for enough speech to cross the line into conduct and then arrest them for it. Sounds great. If it would work, I'd support it.
The problem is that it won't work, and the only way to "fix it" will be to push more and more and more surveillance and control. It's already being pushed. Look at this chat control nonsense. Do you support that?
So what I'm saying, is let's just recognize that it's a basic human right for people to communicate freely and that operators of communication services shouldn't be held liable for the actions of their users.
Yes but let's also be clear that some forms of speech censorship are widely and broadly supported in public, 'town square' or broadcast media situations. Things like child porn, personal threats, calling for or organizing violence, hate speech, etc. Laws and social acceptance of this kind of censorship, of course, differ in different regions.
Hacker News may 'moderate' illegal content on this website, but they don't have a choice in the matter; US or state authorities will shut them down if they do not, so it's technically censorship. Your view on whether this is good or bad will depend on many factors, one of which may be how you view the legal structure of your government, which is substantially different in France, the US, or Dubai (where Telegram is located).
As is mentioned in the article, Telegram is not simply a 'secure messaging app'. They are also serving a role similar to Facebook, Twitter, Instagram, or TikTok. They host publicly accessible channels and public group chats with thousands of members, which are all (apparently) unencrypted and accessible to the Telegram company. It may be reasonable (both legally and socially) to expect a company that has knowledge of public, illegal speech to take steps to remove that content from its platform.
And Durov, by choosing to run a media company and not E2E encrypt all of his users' private communications, has walked right into a situation where he needs to abide by local laws on moderating/censoring illegal content, everywhere.
> Moderation is what happens here on HN: Admins have some policies to keep the conversation on track, users voluntarily submit to them.
What do you mean by users voluntarily submitting to these policies? This distinction seems key in your argument, but I don't see what alternatives to submitting I have here, making it involuntary, right?
No, you miss the point.
If HN decided to ban all posts about Donald Trump that is moderation. Users voluntarily submit to this policy by participating in the site, and if they do not, they will be banned.
If the State of California required that all web sites run from their state are REQUIRED to ban all posts about Donald Trump, that is censorship.
Moderation is "your house, your rules" while censorship is someone else imposing their rules in your house.
Do you see what I'm saying? When France is talking about "moderation" of Telegram, what they actually mean is censorship.
Is removal of CSAM moderation or censorship?
It depends on whether the parties to the communication want that or not.
So let's say a few child molesters create a chat service and use it to send the worst, most horrible child pornography amongst themselves. Removing it is censorship, not moderation.
Look, I'm not trying to argue for the legalization of child pornography here. That is illegal contraband, full stop. The intent of my comment is to say: let's just call it what it is.
I think the overwhelming consensus is that child pornography is so horrible that mere possession of it must be CENSORED.
I'm not arguing that censorship is always wrong. For instance, I don't want to see public billboards of graphic sex or violence. I think it's good that we censor that, so that we aren't forced to look at things like that when we don't want to.
What is bothering me is that proponents of censorship, and especially certain proponents of it who want to use it as a tool to suppress ideas they don't like, have recently started using the word "moderation" in order to sneak their plans into policy without raising objections. The reason is because when we hear the word "censorship" we immediately think, "Whoa, hold on there, censorship is very harsh, let's take a hard look and make sure this is serious enough that resorting to censorship is justified and appropriate", whereas when we hear the word "moderation" we think, "Of course, we all appreciate someone deleting the spam and trolls who annoy us", and we're less likely to think critically about exactly what kind of expression is being legally prohibited.
The author claims that everyone refers to Telegram as an encrypted messenger, but he only provides a single example to support that. I quickly checked Google News and couldn't find any media outlet on the first page that did the same. It feels like manipulation.
UPDATE: anyone who downvotes, I invite you to check for yourselves.
Just a few known media:
1. https://www.aljazeera.com/amp/news/2024/8/25/telegram-messag...
2. https://www.washingtonpost.com/technology/2024/08/25/durov-t...
3. https://www.businessinsider.com/telegram-ceo-pavel-durov-arr...
4. https://www.theguardian.com/media/article/2024/aug/24/telegr...
However, I have indeed seen a few media outlets that call it encrypted. These include France24, POLITICO, and The Times.
Just today, every French newspaper and hundreds around the world. Two examples:
https://www.thetimes.com/world/europe/article/pavel-durov-te... “Chief executive of the encrypted messaging app reportedly detained at an airport near Paris over alleged failure to stop criminal activity on the platform”
https://www.tf1info.fr/high-tech/telegram-qui-est-pavel-duro... (one of the largest French news outlets) “Qui est Pavel Durov, le fondateur de la messagerie cryptée Telegram arrêté samedi en France ?” (“Who is Pavel Durov, the founder of the encrypted messaging app Telegram, arrested in France on Saturday?”)
It’s called handpicking
Subjectively and qualitatively, roughly half of all news articles on Telegram I read contain the word "encrypted" or at least "secure" somewhere.
Lol
https://www.google.com/search?q=Telegram+"encrypted+messagin...
Perhaps the French authorities have some taste in UI/UX. They're going to keep him in jail until telegram is no longer painful to use.
There's a long list of things I dislike about Telegram, but UI/UX is really not on it.