this post was submitted on 06 Jan 2025
121 points (100.0% liked)

Technology

[–] wyrmroot@programming.dev 33 points 2 days ago (3 children)

Homomorphic encryption, which allows for analyzing secret data without a decryption step, is actually incredibly cool. It's a shame the conversation will begin with the fact that they shipped the feature enabled by default.
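
For the curious, here is a toy sketch of what "analyzing without a decryption step" means, using Paillier, the simplest scheme with a homomorphic property. Apple's actual deployment uses a lattice-based scheme, and nothing below is production-grade; it only illustrates the core trick of computing on ciphertexts.

```python
import math
import random

# Toy Paillier keypair. Real keys use ~1024-bit primes; these tiny ones
# keep the demo instant and are completely insecure.
p, q = 499, 547
n, n2 = p * q, (p * q) ** 2
g = n + 1                          # standard simple choice of generator
lam = math.lcm(p - 1, q - 1)       # Carmichael function of n
mu = pow(lam, -1, n)               # decryption constant (valid for g = n + 1)

def encrypt(m: int) -> int:
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:     # the blinding factor must be a unit mod n
        r = random.randrange(2, n)
    return pow(g, m, n2) * pow(r, n, n2) % n2

def decrypt(c: int) -> int:
    return (pow(c, lam, n2) - 1) // n * mu % n

# Multiplying two ciphertexts adds the hidden plaintexts, so a server
# holding only ciphertexts can compute an encryption of 12 + 30 without
# ever learning 12, 30, or the result.
a, b = encrypt(12), encrypt(30)
assert decrypt(a * b % n2) == 42
```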

[–] ClassifiedPancake@discuss.tchncs.de 25 points 2 days ago (1 children)

And it’s right that this is the conversation, because Apple needs to learn that people want to be in control and that these things need to be opt-in. They can build the most sophisticated, fanciest system to protect your privacy; if it’s sending your stuff to another server, it needs to ask for permission, full stop.

[–] tempest@lemmy.ca 12 points 1 day ago

They (and every other tech company) have been doing this type of thing for nearly 20 years. You might see some whinging about it in some corners of the Internet, like here, but most people don't know or don't give a shit.

It sucks.

[–] t3rmit3@beehaw.org 9 points 1 day ago (1 children)

It allows processing data without decrypting it, which is great in terms of preventing someone else from snooping on it, but it doesn't change the fact that Apple retains the ability to analyze the data's content, which is the actual issue here.

[–] scrubbles@poptalk.scrubbles.tech 5 points 1 day ago (1 children)

Reading between the lines, I guarantee they're doing the same thing for CSAM protection. I think sex offenders caused this to happen: I believe Apple found out they were using Photos to host that horrid stuff, and Apple can't just ignore it, so I think we have them to thank for this.

[–] t3rmit3@beehaw.org 4 points 1 day ago (2 children)

I would be interested to see what lines you read between, because "identifying landmarks and points of interest" doesn't sound like anything capable of identifying CSAM. I think you're giving a big corporation credit there's no reason to believe it's owed, for an excuse it never professed.

[–] Redjard@lemmy.dbzer0.com 2 points 1 day ago

They did this exact thing for CSAM detection a while back and were made to stop by public outcry.
Back then the analysis may have happened locally, before encryption, but it was still done without the user's consent, with flagged results sent off to Apple.

It is entirely plausible that here they would have the device decrypt, check the generated description against a database, and send the file and description off for reporting when a match is found.

[–] scrubbles@poptalk.scrubbles.tech 2 points 1 day ago* (last edited 1 day ago) (1 children)

Apple killed its last version in August 2023 because it didn't respect privacy. But where there's object detection, there's CSAM detection. Which, hey, I think is good, and I wouldn't expect an announcement about it. I just see how they rolled this out, and it's exactly how I'd roll out a privacy-focused CSAM detector if I were going to do it.

From August 2023, when they killed the non-privacy-focused version: https://www.wired.com/story/apple-csam-scanning-heat-initiative-letter/

[–] t3rmit3@beehaw.org 4 points 1 day ago* (last edited 1 day ago)

Where there’s object detection, there’s CSAM detection.

This is not true at all. A model has to be trained to detect specific things; it does not automatically inherit the ability to detect CSAM just because it can detect other objects. The method Apple previously used for CSAM image detection (perceptual hashing) was killed over its poor privacy implementation, and the article specifically notes that

Tsai argues Apple's approach is even less private than its abandoned CSAM scanning plan "because it applies to non-iCloud photos and uploads information about all photos, not just ones with suspicious neural hashes."

So even images that the local detection model doesn't flag as suspicious would have information about them uploaded to their servers.
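
To make the distinction concrete, here is a toy sketch of perceptual-hash matching, using a simple average hash (NeuralHash used a learned embedding, but the matching step works the same way). It can only flag images whose hashes are already in a known database, which is a completely different capability from open-ended object detection.

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str) -> int:
    """Collapse an image to 64 bits: brighter or darker than its own mean."""
    img = Image.open(path).convert("L").resize((8, 8))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (px > avg)
    return bits

def matches_database(h: int, known_hashes: set[int], max_distance: int = 5) -> bool:
    # Hamming distance: how many of the 64 bits differ. A small distance
    # means "probably the same known image", not "similar subject matter".
    return any(bin(h ^ k).count("1") <= max_distance for k in known_hashes)
```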

Apple killed its last version in August 2023 because it didn't respect privacy.

It was also not that good.

[–] teletext@reddthat.com 7 points 2 days ago (2 children)

Homomorphic encryption is the same thing as backdoored encryption.

[–] wyrmroot@programming.dev 21 points 2 days ago* (last edited 2 days ago) (2 children)

This is not the case, but I do still disagree with the “trust me bro” approach to a feature rollout that sends your data somewhere, encrypted or not.

Edit: For those interested, the reason it's not the same as a backdoor is that the result of the computation done on HE data is itself still encrypted and readable only by the original owner. So you can effectively offload the work of a certain analysis to a server that you don't actually trust with your keys.
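
As a rough illustration of that offloading flow, here is a sketch using the python-paillier (phe) library; the library choice is mine for readability, and Apple's deployment uses a lattice-based scheme (BFV) with a far more involved protocol. The server computes a dot product against the client's encrypted vector and learns neither the query nor the answer.

```python
from functools import reduce
from phe import paillier  # pip install phe

public_key, private_key = paillier.generate_paillier_keypair()

# Client: encrypt a query vector (say, an image embedding) and ship it off.
query = [3, 1, 4]
encrypted_query = [public_key.encrypt(x) for x in query]

# Untrusted server: additive HE supports ciphertext + ciphertext and
# ciphertext * plaintext scalar, which is exactly enough for a dot product.
weights = [2, 1, 5]
products = [c * w for c, w in zip(encrypted_query, weights)]
encrypted_score = reduce(lambda a, b: a + b, products)

# Client: only the private-key holder can read the result.
assert private_key.decrypt(encrypted_score) == 27  # 3*2 + 1*1 + 4*5
```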

[–] p03locke@lemmy.dbzer0.com 2 points 1 day ago

readable only by the original owner

Right now it's not. But all encryption eventually gets its back broken, whether by security flaws or by brute-force mathematics.

[–] t3rmit3@beehaw.org 1 points 1 day ago

For those interested, the reason it’s not the same as a backdoor is that the result of the computation done on HE data is itself still encrypted and readable only by the original owner. So you can effectively offload the work of a certain analysis to a server that you don’t actually trust with your keys.

Do iPhones have a BYOK system that lets people supply their own keypairs? Or is the OS open source, so people can verify how the keys are handled? Because if not, it sounds like all it takes to break this is for Apple's OS, which Apple controls, to ship the private keys it generated up to Apple's servers.

[–] Revan343@lemmy.ca 14 points 2 days ago (1 children)

Just say you don't understand encryption

[–] teletext@reddthat.com 3 points 2 days ago

Sure, bro. Whatever makes you happy.