this post was submitted on 05 Jan 2025
Privacy
TL;DR edit: I'm supporting the above comment, i.e., I do not support Apple's actions in this case.
It's definitely good for people to learn a bit about homomorphic encryption, and let's give Apple some credit for investing in this area of technology.
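For anyone curious what "computing on encrypted data" even means, here's a toy sketch of the core idea. This uses textbook (unpadded) RSA, which happens to be multiplicatively homomorphic - a teaching example with tiny hand-picked numbers, NOT the scheme Apple uses and NOT secure in any real sense:

```python
# Toy homomorphic-encryption demo: a "server" can multiply two
# ciphertexts without ever decrypting them. Textbook RSA only;
# all parameters below are illustrative assumptions.
p, q = 61, 53
n = p * q            # RSA modulus (3233)
e = 17               # public exponent
d = 413              # private exponent: (17 * 413) % lcm(60, 52) == 1

def encrypt(m):      # ciphertext = m^e mod n
    return pow(m, e, n)

def decrypt(c):      # plaintext = c^d mod n
    return pow(c, d, n)

a, b = 7, 6
ca, cb = encrypt(a), encrypt(b)

# The server multiplies ciphertexts without ever seeing a or b...
c_product = (ca * cb) % n

# ...yet decrypting the result yields the product of the plaintexts.
assert decrypt(c_product) == a * b
print(decrypt(c_product))  # 42
```

Real schemes (like the fully homomorphic ones used for private server-side lookups) support far richer operations, but the principle is the same: the server computes on data it cannot read.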
That said:
In the majority of cases, encryption doesn't actually buy absolute privacy or security; it buys time - see NIST's criterion of ≥30 years of security for AES. It will almost certainly become crackable eventually, whether through deliberate weakening or other advances. How many people are truly able to give genuine informed consent in that context?
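To be clear, "buys time" doesn't mean brute force is the worry - the key space is absurdly large. A back-of-envelope calculation (the 10^12 keys/second attacker rate is my own assumption, picked to be generous) shows why the realistic threats are weakened implementations and cryptanalytic advances, not exhaustive search:

```python
# Back-of-envelope: exhaustive search of the AES-128 key space.
# Assumption: attacker tests 1e12 keys per second (very generous).
keys = 2 ** 128                 # AES-128 key space
rate = 1e12                     # keys tested per second (assumed)
seconds_per_year = 31_557_600   # Julian year
years = keys / rate / seconds_per_year
print(f"{years:.2e} years")     # ~1.08e19 years
```

That's about a billion times the age of the universe, so when encrypted data leaks or gets broken in under 30 years, it's essentially never brute force.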
Encrypting something doesn't always work out as planned; see this example:
"DON'T WORRY BRO, ITS TOTALLY SAFE, IT'S ENCRYPTED!!"
Source
Yes, Apple is surely capable enough to avoid simple, documented mistakes like the one above, but it's still quite likely some mistake will be made. And note that Apple is also quite capable of engineering a leak and concealing it, or making it appear accidental (or, even if it's truly accidental, leveraging it later on).
Whether they'd take that risk, and whether their (un)official internal policy would support or reject it, is of course in the realm of speculation.
That they'd have the technical capability to do so isn't at all unlikely. The same goes for any capable entity with access to Apple's infrastructure.
How hard is it to grasp that I don't want Apple doing anything on my phone that I didn't explicitly consent to?
I don't care what technology they develop, or whether they're capable of applying it correctly: the point is, I don't want it on my phone in the first place, any more than I want them setting up camp in my living room to take notes on what I'm doing in my house.
My phone, my property, and Apple - or anybody else - is not welcome on my property.
Sorry for my poor phrasing - perhaps re-read my post? I'm entirely supporting your argument. Perhaps your main point aligns most with my #3? It could be argued they've already begun from a position of probable bad faith by taking this data from users in the first place.
Oh yeah I kinda missed your last point. Sorry 🙂