I don’t want to see PGP rejected purely on usability grounds. So, to level the playing field at the user level, let’s take Delta Chat, which uses PGP, if I understand that correctly.

I have no knowledge of Telegram’s security at all.
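
To illustrate the Delta Chat point: it encrypts messages with OpenPGP on the client before they ever reach the mail server, so the server only relays ciphertext. A rough sketch of that step using the python-gnupg bindings (the address and keyring path here are made up, and it assumes the recipient’s public key is already in the local keyring):

```python
# Rough sketch of the OpenPGP encryption step a client like Delta Chat
# automates; python-gnupg wraps a local GnuPG installation.
import gnupg

gpg = gnupg.GPG(gnupghome="/tmp/demo-keyring")  # hypothetical keyring location

plaintext = "usability test: this never leaves the device unencrypted"

# Encrypt to the recipient's public key (assumed to be imported already).
# always_trust=True just keeps the demo from tripping over trust levels.
encrypted = gpg.encrypt(plaintext, recipients=["alice@example.org"],
                        always_trust=True)

if encrypted.ok:
    # Only this ASCII-armored ciphertext is handed to the mail server.
    print(str(encrypted))
else:
    print("encryption failed:", encrypted.status)
```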

[–] TCB13@lemmy.world 2 points 2 months ago (1 children)

And what about Signal? If some government finds a group chat they don’t like, will Signal take it down? How would they even know what’s in it, if all the content is encrypted?

CSAM? More like copyright infringement. CSAM is the usual cheap excuse to shut down everything because of the obvious social implications.

[–] pupbiru@aussie.zone 2 points 2 months ago (1 children)

if a govt seizes a device and discovers channel IDs that need to be taken down, i’m sure that signal would do so - there have been no arrest warrants, after all… however, the problem is also significantly smaller for signal because signal can’t have enormous broadcast groups

it’s kinda irrelevant what it is - you have to comply with police orders to moderate your platform… if this were musk and x, lemmy would be cheering on the arrest! no matter who you are, you ~~don’t~~ shouldn’t get to just break the law

and you’re right, CSAM is frequently used as an excuse, and no, i don’t have evidence - that would require actually going looking for said content, which i have no inclination to do. the only information i have is that multiple independent news outlets have reported on it in connection with telegram for years - not proof, but a more convincing argument than simple denial - because let’s not kid ourselves: unless you’ve gone looking for that content, you’ve got no proof against it either (and even if you didn’t find it, that’s no guarantee - it’s unlikely to be easy to find)

[–] TCB13@lemmy.world 1 points 2 months ago* (last edited 2 months ago) (1 children)

> you have to comply with police orders to moderate your platform…

Your points are fair; however, where does it stop? If the police say "make it all plaintext", then what happens? It is a police request, after all.

This notion that chat platforms and others "need" to comply with police / govt orders and remove content is very tricky... should platforms really censor everything governments ask for? What if it is a group chat about a corrupt political party in power (with proof)? The govt will say it is CSAM, then Signal will shut it down, and our democracies are gone.

To make it really clear: I'm not for breaking the law, and I don't think that content should be on such platforms. The problem is that once you start removing that content, the precedent will be abused to remove other, actually important stuff because "it is CSAM", and with E2EE there is no way to check whether it really is CSAM, nor should the platform be the judge of the content.
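
To make that last point concrete, here is a toy model (Python's `cryptography` package, not Signal's actual protocol): with end-to-end encryption, whatever sits between the two endpoints only ever holds ciphertext, so there is nothing it could inspect to confirm or deny any claim about the content.

```python
# Toy E2EE relay: not Signal's protocol, just an illustration that the
# party in the middle only ever handles opaque ciphertext.
from cryptography.fernet import Fernet

# In a real system this key is negotiated between the endpoints and never
# leaves them; here we just generate it locally for the demo.
shared_key = Fernet.generate_key()
sender = Fernet(shared_key)
receiver = Fernet(shared_key)

ciphertext = sender.encrypt(b"evidence against the party in power")

# Everything a server (or whoever orders it around) gets to see:
print(ciphertext)  # an opaque token that reveals nothing about the content

# Only an endpoint holding the key can recover the message:
print(receiver.decrypt(ciphertext))
```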

[–] pupbiru@aussie.zone 1 points 2 months ago

this is the slippery slope fallacy… “where does it stop” is not a valid argument to not start