Comment by innagadadavida

1 year ago

I am amazed at the low-quality comments here. Encryption really doesn’t matter as much as the trust of the app here. Any malicious app author can 100% securely encrypt everything on the wire and yet leak 100% of your data to some state actor. Anything you type into the chat box is only encrypted by the app after you type it, and it’s probably stored in the clear in some local SQLite DB. That gives them a whole bunch of options to mess with that plain-text data. Even if the app’s source code is published, you don’t know whether they backdoored it before submitting it to the App Store.

>Encryption really doesn’t matter as much as the trust of the app here. Any malicious app author can 100% securely encrypt everything on the wire and yet leak 100% of your data to some state actor.

This is exactly the problem with Telegram. Telegram defaults to client-server encryption for everything, and you can't enable end-to-end encryption for anything on desktop, or for group chats ever. Only 1:1 chats and calls on mobile have end-to-end encryption. Client-server encryption is exactly that "100% securely encrypt everything on the wire". Once the data arrives at the server, it's no longer encrypted, and Telegram can do whatever it wants with it, including leaking it to some state actor (like the FSB/SVR).
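To make the distinction concrete, here's a toy Python sketch. XOR is just a stand-in for a real cipher (never use it in practice), and all the variable names are made up for illustration: under client-server encryption the server shares the key and decrypts on arrival, while under end-to-end encryption the server only ever relays opaque ciphertext.

```python
import os

def toy_cipher(data: bytes, key: bytes) -> bytes:
    # XOR with a repeating key: a toy stand-in for AES/MTProto, nothing more.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Client-server encryption (Telegram cloud chats): the wire is encrypted,
# but the server holds the key, so it sees the full plaintext.
transport_key = os.urandom(16)
wire = toy_cipher(b"meet at noon", transport_key)   # encrypted in transit
server_sees = toy_cipher(wire, transport_key)       # server decrypts on arrival
assert server_sees == b"meet at noon"               # plaintext at the server

# End-to-end encryption (Signal; Telegram secret chats): only the two
# endpoints hold the chat key; the server just stores/relays ciphertext.
chat_key = os.urandom(16)                           # negotiated endpoint-to-endpoint
wire = toy_cipher(b"meet at noon", chat_key)
server_sees = wire                                  # random-looking bytes only
recipient_reads = toy_cipher(server_sees, chat_key)
assert recipient_reads == b"meet at noon"
```

In the first half a server compromise (or a subpoena) yields readable chats; in the second half it yields ciphertext, which is the whole point of the distinction.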

>Anything you type into the chat box is only encrypted by the app after you type it, and it’s probably stored in the clear in some local SQLite DB.

If endpoint security is a concern, your options with networked TCBs are quite limited. Are you sure the malware can't escalate its privileges and read messages in the clear from RAM?

>It gives them a whole bunch of options to mess with that plain text data.

I'm looking forward to hearing how you managed to fix this. Should we implement memory as eFuses (https://en.wikipedia.org/wiki/EFuse) to prevent editing logs? And what if the user wants to delete their messages?

>Even if the app’s source code is published, you don’t know whether they backdoored it before submitting it to the App Store.

E.g., with Signal for Android, you can pull the APK off the device and compare its hash against a client reproducibly built from the source code in your possession. Been there, done that: https://imgur.com/a/wXYVuWG

>I am amazed at the low-quality comments here.

Too bad you're not exactly improving them with your nonsense.

> malicious app author can 100% securely encrypt everything on the wire and yet leak 100% of your data

Um, surely you understand the difference between piping random-looking bytes uselessly to whoever is listening, and having a readable copy of all data readily available to whoever hacks the system or applies for a sysadmin role? Or are you assuming that people use a closed-source client and that the server can push malicious code?

> Even if the app’s source code is published, you don’t know whether they backdoored it before submitting it to the App Store.

That doesn't work if third parties are also building against the system or forking the code; tampering gets noticed. Your concept of "e2ee can be 100% leaked anyway" only works if you don't know what code you're running. You do need to trust the community in general to uncover issues you've overlooked (in the code or in the build process), but that's not the same as having no encryption at all. You can't audit the servers, but you can audit the client code.

  • > You need to trust the community in general to uncover issues

    My point is that this community could just be your friendly CIA operatives running the show behind a veneer of open source. Also, this “community” has no liability, unlike the closed-platform companies.

Telegram basically has "trust me bro" security.

Even worse than Apple. They at least have some e2ee options.