submitted 6 months ago by clot27@lemm.ee to c/privacy@lemmy.ml

Here's what he said in a post on his telegram channel:

🤫 A story shared by Jack Dorsey, the founder of Twitter, uncovered that the current leaders of Signal, an allegedly “secure” messaging app, are activists used by the US state department for regime change abroad 🥷

🥸 The US government spent $3M to build Signal’s encryption, and today the exact same encryption is implemented in WhatsApp, Facebook Messenger, Google Messages and even Skype. It looks almost as if big tech in the US is not allowed to build its own encryption protocols that would be independent of government interference 🐕‍🦺

🕵️‍♂️ An alarming number of important people I’ve spoken to remarked that their “private” Signal messages had been exploited against them in US courts or media. But whenever somebody raises doubt about their encryption, Signal’s typical response is “we are open source so anyone can verify that everything is all right”. That, however, is a trick 🤡

🕵️‍♂️ Unlike Telegram, Signal doesn’t allow researchers to make sure that their GitHub code is the same code that is used in the Signal app run on users’ iPhones. Signal refused to add reproducible builds for iOS, closing a GitHub request from the community. And WhatsApp doesn’t even publish the code of its apps, so all their talk about “privacy” is an even more obvious circus trick 💤

🛡 Telegram is the only massively popular messaging service that allows everyone to make sure that all of its apps indeed use the same open source code that is published on Github. For the past ten years, Telegram Secret Chats have remained the only popular method of communication that is verifiably private 💪

Original post: https://t.me/durov/274

[-] fushuan@lemm.ee 2 points 6 months ago

Telegram stores all your conversations on its servers, since you don't need to be connected on the phone, or even have the phone switched on, to chat from a PC or another phone. This means that the server is the authority. WhatsApp is not like that: if you delete a shared photo, after a while it gets evicted from the cache and you lose access to it, meaning they don't store that stuff server-side. The same thing happens with WhatsApp desktop or web: they sit on an infinite loading icon until you switch on the phone, or sometimes even unlock it.

This means that whatever Telegram develops must not only keep the group chat encrypted on the server, but also let any valid client of a user decipher the content, so every client must somehow hold the key that unlocks it. One way of doing it would be for every client of a single user to generate its own key pair (which I'm sure they already do) and perform a key exchange between them to agree on a single shared key, which is what identifies your account. Then you could use that shared key to decipher the group chat's shared key, which Telegram could store on its server, or do whatever is done in those cases; I'm not that well versed.
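A toy sketch of that envelope idea (not real cryptography and not Telegram's actual scheme — the names, and the SHA-256-based XOR stream cipher, are made up purely for illustration): a random group-chat key is stored wrapped under each user's shared key, so any legitimate client holding that shared key can unwrap it.

```python
# Toy "envelope" key wrapping: the server stores the group key only in
# wrapped (encrypted) form; each account's shared key can unwrap it.
import secrets
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream from key+nonce via SHA-256 (toy cipher, not secure)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def wrap(shared_key: bytes, group_key: bytes) -> tuple[bytes, bytes]:
    """XOR-encrypt the group key under the account's shared key."""
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(group_key, keystream(shared_key, nonce, len(group_key))))
    return nonce, ct

def unwrap(shared_key: bytes, nonce: bytes, ct: bytes) -> bytes:
    """Reverse the XOR to recover the group key."""
    return bytes(a ^ b for a, b in zip(ct, keystream(shared_key, nonce, len(ct))))

group_key = secrets.token_bytes(32)      # key protecting the group chat
alice_shared = secrets.token_bytes(32)   # Alice's per-account shared key

nonce, wrapped = wrap(alice_shared, group_key)   # server stores (nonce, wrapped)
assert unwrap(alice_shared, nonce, wrapped) == group_key
```

The point of the sketch is just the structure: the server never needs the group key in the clear, only the wrapped blob per account.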

The problem here lies in what happens when you delete and/or log out of all your clients. Currently you can log into the server again because Telegram holds all the required info, but if they stored the "shared key" itself then it's all moot. I guess they could store a user-identifying key pair, with the private key encrypted with a password, so that it can be accessed from anywhere. They should, as always, offer MFA and passkey alternatives so you can prove your identity every time you log into a new client, without requiring the password and so on.
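The "private key encrypted with a password" idea can be sketched with stdlib PBKDF2 (again a toy: real systems would use an AEAD cipher such as AES-GCM, and ideally a memory-hard KDF; the XOR-with-derived-keystream here is only to show the shape of password-based key wrapping):

```python
# Toy password-based key wrapping: derive a wrapping key from the password
# with PBKDF2 and XOR-encrypt the private key; the server stores (salt, wrapped)
# and never sees the password or the plaintext private key.
import secrets
import hashlib

ITERATIONS = 100_000  # illustrative; tune to current guidance in practice

def wrap_private_key(password: str, private_key: bytes) -> tuple[bytes, bytes]:
    salt = secrets.token_bytes(16)
    kek = hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                              ITERATIONS, dklen=len(private_key))
    return salt, bytes(a ^ b for a, b in zip(private_key, kek))

def unwrap_private_key(password: str, salt: bytes, wrapped: bytes) -> bytes:
    kek = hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                              ITERATIONS, dklen=len(wrapped))
    return bytes(a ^ b for a, b in zip(wrapped, kek))

priv = secrets.token_bytes(32)
salt, wrapped = wrap_private_key("correct horse battery staple", priv)
assert unwrap_private_key("correct horse battery staple", salt, wrapped) == priv
```

With this shape, a fresh login on a new device only needs the password (plus MFA) to recover the account key pair from the server-stored blob.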

This is a roughly designed idea I just had that should theoretically work, but I'm sure there are more elegant ways to go about it.

It's real work to implement all of this securely, given that you'd have to somehow migrate everything that already exists into the new encryption model, make everyone create a password and so on, while keeping it as seamless as possible for users. However, I feel it's been quite a while, and if they haven't done it already, they just won't. We either trust them with our data or look for an alternative, and sadly no alternative has all the buzz right now.

[-] rdri@lemmy.world 1 points 6 months ago

Sorry, I have a hard time understanding the gist of your text. I don't think it's worth worrying about what happens with access that was already legitimately acquired, because that fact alone already poses a bigger threat (which may have more to do with the nature of the conversations than with how the platform works).

[-] fushuan@lemm.ee 1 points 6 months ago* (last edited 6 months ago)

I wasn't talking about situations with compromised accounts. I was talking about legitimate accounts, created in the typical way, being converted to a zero-knowledge encryption model, and acknowledging that the conversion is hard when a user may have several clients logged in (2 phones, 6 computers...).

My point was that if they haven't put any effort into the transition yet, they never will, because the bigger the userbase, the harder the transition is to manage. I also find that sad, because they should have invested more effort in that instead of all the features we keep getting, but whatever.

If you found the technical terms confusing: public/private keys are a kind of asymmetric "password" used in cryptography to secure messages, while shared keys are symmetric passwords. The theory behind key exchanges and the protocols built on them is taught in introductory cryptography courses at the bachelor's and master's level; I'm sorry to say I don't have the energy to explain more, but feel free to read up on the terms if you feel like it.

If, however, you found it confusing because I write like crap, I'm sorry for potentially offending you with the above paragraph, and I'll blame my phone keyboard for it :)
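For the curious, the key-exchange idea mentioned above can be shown with a minimal textbook Diffie-Hellman: two parties agree on a symmetric shared key using only public messages. This is a classroom toy over a small Mersenne prime, not something to deploy; real protocols use much larger groups or elliptic curves (e.g. X25519).

```python
# Textbook Diffie-Hellman: Alice and Bob each keep a secret exponent,
# exchange only public values, and both arrive at the same shared secret.
import secrets

P = 2**127 - 1   # a Mersenne prime; fine for a toy, far too small in practice
G = 3            # public generator

a = secrets.randbelow(P - 2) + 1   # Alice's secret exponent
b = secrets.randbelow(P - 2) + 1   # Bob's secret exponent

A = pow(G, a, P)   # Alice sends this to Bob (public)
B = pow(G, b, P)   # Bob sends this to Alice (public)

alice_secret = pow(B, a, P)   # what Alice computes
bob_secret = pow(A, b, P)     # what Bob computes
assert alice_secret == bob_secret   # both sides derived the same shared key
```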

[-] rdri@lemmy.world 0 points 6 months ago* (last edited 6 months ago)

No, that's not what I didn't understand. The problem as you describe it seems either a non-issue or something very few people (who are already using Telegram anyway) would care about. I don't understand the scenario that would pose a problem for the user. The moment some account legitimately gains access to some chat is probably what should trouble you instead.

this post was submitted on 08 May 2024
235 points (80.4% liked)
