this post was submitted on 16 Sep 2024
58 points (96.8% liked)
Privacy
you are viewing a single comment's thread
The same way you would do it with a black box, while optionally taking as many shortcuts as you're comfortable with, on the assumption that you have a better understanding of how it's been built?
Get it audited by tools (e.g. OneSpin) or people (e.g. Bunnie) that one trusts?
I'm not saying it's intrinsically safer than other architectures, but it is at least more inspectable, and for people who do value trust, for whatever reason, the auditing can again be federated.
I assume that if you're asking the question you are skeptical about it, so I'm curious to know what you believe is a better alternative and why.
I mean, can't they just submit for audit a version that doesn't have the backdoor or snooping? Verifying the design against the actual silicon is probably very hard.
I imagine it's like everything else: you can only realistically verify a random sample. It's like trucks crossing a border: they should ALL be checked, but in practice only a few get checked and punished, with the hope that the punishment will deter the others.
Here, if 1 chip is checked for every 1 million produced and a single problem is found with it, whether a backdoor or "just" a security flaw that is NOT present in the original design, then trust in the company producing them is shattered. Nobody who can afford alternatives will want to work with them.
I imagine that in a lot of situations the economic risk is not worth it. Even if, say, a state actor commissions a backdoor and tells the producing company it will cover their losses, as soon as the news is out nobody will use the chips, so even for a state actor it doesn't work.
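To make the sampling argument concrete, here is a rough back-of-the-envelope sketch (the rates and sample sizes are my own illustrative numbers, not from the thread): the chance of catching at least one tampered chip by random inspection is 1 − (1 − p)^k, where p is the fraction of tampered chips and k the number inspected. Deterrence works if tampering has to be widespread to be useful, because then even a tiny sample catches it.

```python
def detection_probability(tamper_rate: float, samples: int) -> float:
    """Chance that a random sample of `samples` chips contains at least one
    tampered chip, assuming a uniform tamper rate and independent draws.
    (Both parameters are illustrative assumptions, not real-world figures.)
    """
    return 1.0 - (1.0 - tamper_rate) ** samples

# A backdoor baked into every chip is caught by the very first inspection:
print(detection_probability(1.0, 1))        # certain detection

# A backdoor in half the chips is caught quickly:
print(detection_probability(0.5, 10))       # near-certain after 10 checks

# But a one-in-a-million implant stays below ~1% detection even after
# inspecting ten thousand chips:
print(detection_probability(1e-6, 10_000))
```

So checking "1 chip per million" really only shatters trust when the flaw is in the shared design or mask, i.e. present in every unit; a rare targeted implant would likely slip through random sampling.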
That's true, but sadly that won't help against a state forcing a company to put these things into the silicon. Not saying they do right now, but it's a real possibility.