this post was submitted on 30 Aug 2024
170 points (80.6% liked)
Privacy
It's a street, not a changing booth. Also, I'm familiar with every charge against Durov, and I have personally seen the illegal content I mentioned. If it's so easily accessible to the public and persists for years, it has nothing to do with privacy, and there is no moderation - his own words underscore the latter point.
Who said it's a street? What makes it a street?
Did you seek it out? Neither I nor anyone I know personally has ever encountered anything like what was described on that platform, and I've been on it for years.
Was it the same "channel" or "group chat" that persisted for years?
What gives them the right or the responsibility to moderate a group chat or channel any more than, say, Signal or Threema? Just because their technical back end lets them?
I mean, by that argument Signal could do client-side scanning on everything (that's enforcement at the platform level that fits their technical limitations). Is that where we're at? "If you can figure out how to violate privacy in the name of looking for illegal content, you should"?
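For concreteness: "client-side scanning" just means the app inspects the plaintext before it is encrypted, so end-to-end encryption is no obstacle to it. Here is a minimal sketch of the idea (everything below - the blocklist, the function names - is a hypothetical illustration, not Signal's actual code):

```python
# Hypothetical client-side scanning hook: the client hashes each outgoing
# message and checks it against a blocklist *before* end-to-end encryption,
# so E2E guarantees say nothing about what the client itself inspects.
import hashlib

# Hypothetical server-supplied blocklist of forbidden-content hashes.
BLOCKLIST = {
    hashlib.sha256(b"known bad payload").hexdigest(),
}

def scan_before_send(plaintext: bytes) -> bool:
    """Return True if the message may be sent, False if it is flagged."""
    digest = hashlib.sha256(plaintext).hexdigest()
    return digest not in BLOCKLIST

def send(plaintext: bytes) -> str:
    if not scan_before_send(plaintext):
        # A real deployment might report to a server rather than just block.
        return "flagged"
    # ...encryption and transmission would happen here; elided in this sketch.
    return "sent"
```

So `send(b"hello")` goes through, while a message matching the blocklist is flagged before encryption ever happens - which is exactly the capability being argued about.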
Nothing Telegram offers is equivalent to the algorithmic feeds that require moderation on YouTube, Twitter, Instagram, or Facebook; everything has to be sought out.
Make no mistake, I'm not defending the content. The people who used the platform to share it should be arrested. However, I'm not sure I agree with the moral dichotomy we've gotten ourselves into, where, e.g., a messenger is held legally responsible for failing to refuse service to people doing illegal things.
I won't go into the specific channels, so as not to promote them or what they do, but we can talk about one known example: how Bellingcat got to the FSB officers responsible for the poisoning of Navalny via their mobile phone call logs and airline ticket data. They used two highly popular bots, H****a and the E** ** G**, which let anyone pull everything the government and other social networks know about any citizen of Russia for about $1 to $5. They use the Telegram API and have been there for years. How do you moderate that? You don't. You take it down as the illegal, privacy-violating, doxxing-enabling content that it is.
Edit: "Censored" the names of the bots, as I still don't want to make them even easier to find.
Was that a bad thing? I've never heard the name Bellingcat before, but it sounds like this would've been partially responsible for the reporting about the Navalny poisoning?
Ultimately, that sounds like an issue the Russian government needs to fix. Telegram bots are also trivial to launch and duplicate, so actually detecting and shutting that down without it becoming a massive, expensive money pit is difficult.
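To show how low that barrier is: a complete polling bot against Telegram's public Bot API fits in a few dozen lines, which is why taking one down just spawns a copy. A minimal echo-bot sketch (the token is a placeholder from the hypothetical @BotFather registration; `getUpdates` and `sendMessage` are real Bot API methods):

```python
# Minimal echo bot using Telegram's public Bot API over plain HTTPS.
import json
import urllib.request

API_BASE = "https://api.telegram.org"

def method_url(token: str, method: str) -> str:
    # Every Bot API call is just an HTTPS request to /bot<token>/<method>.
    return f"{API_BASE}/bot{token}/{method}"

def poll_once(token: str, offset: int = 0) -> list:
    # Long-poll for new updates (network call; not exercised in this sketch).
    url = method_url(token, "getUpdates") + f"?timeout=30&offset={offset}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp).get("result", [])

def send_message(token: str, chat_id: int, text: str) -> None:
    data = json.dumps({"chat_id": chat_id, "text": text}).encode()
    req = urllib.request.Request(
        method_url(token, "sendMessage"),
        data=data,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

if __name__ == "__main__":
    TOKEN = "123456:PLACEHOLDER"  # hypothetical token, not a real credential
    offset = 0
    while True:
        for update in poll_once(TOKEN, offset):
            offset = update["update_id"] + 1
            msg = update.get("message")
            if msg and "text" in msg:
                send_message(TOKEN, msg["chat"]["id"], msg["text"])
```

Anyone with a fresh token can redeploy this in minutes, which is the whack-a-mole problem in a nutshell.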
It's easy to say "oh they're hosting it, they should just take it down."
https://www.washingtonpost.com/politics/2018/10/16/postal-service-preferred-shipper-drug-dealers/
Should the US federal government hold itself liable for delivering illegal drugs via its own postal service? There's serious nuance in what counts as reasonable liability for a carrier, and personally holding a CEO criminally liable is a pretty extreme instance of that.
Telegram is often in the news for public groups full of crime.
"The news" is too vague a source to dispute.
Signal could very clearly see all the messages you send if they just added a bit of code.
But with that small tweak to their front end, they could "VERY CLEARLY SEE that the platform is being misused." So by your own argument, the government should force them to do so (and presumably anyone who's uncomfortable with that can "just not use Signal").
🙄