this post was submitted on 04 Sep 2023
144 points (97.4% liked)
Technology
Man, I remember the controversy when this initiative launched. Can't please anyone, it seems.
I never supported it, since it ran on-device, and given that this is the US, hashes to spot "extremism" could be added, because Apple doesn't know what the hashes actually represent.
No, you're wrong.
They are not cryptographic hashes. They are "perceptual" hashes, or "fuzzy" hashes; they're basically just a low-resolution copy of the original image. It's trivial for an attacker to maliciously send innocent-seeming images that are a hash collision. This is, by the way, a feature, not a bug: perceptual hashes are not designed to perform an exact match.
There are plenty of free whitepapers on how perceptual hashes work, and Facebook's implementation is even open source.
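To make the "low-resolution copy" idea concrete, here is a toy sketch of an average hash (aHash), one of the simplest perceptual-hash schemes. Production systems like Apple's NeuralHash or Facebook's open-source PDQ are far more sophisticated, but the principle is the same: reduce the image to a small fingerprint, then compare fingerprints by Hamming distance rather than demanding an exact match. The images below are made-up 8x8 grayscale grids just for illustration.

```python
def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255). Returns a 64-bit int:
    one bit per pixel, set when that pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes; small = 'same' image."""
    return bin(a ^ b).count("1")

# Two "images": the second is the first with uniform brightness noise added.
img1 = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
img2 = [[min(255, p + 3) for p in row] for row in img1]

h1, h2 = average_hash(img1), average_hash(img2)
print(hamming(h1, h2))  # prints 0: the fingerprints match despite the noise
```

Because the threshold is relative to the image's own mean, small global changes leave the fingerprint untouched, which is exactly why near-duplicates match and also why collisions between unrelated images are possible by design.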
Apple said they tested 100 million perfectly legal images and three had collisions with a CSAM perceptual hash. When you consider how many photos Apple was proposing to scan (hundreds of trillions of photos), that rate means millions of false positives would have occurred even if nobody maliciously abused the system.
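The scale argument is just linear extrapolation. Assuming Apple's reported rate of 3 collisions per 100 million benign photos held at scale (a big assumption, since the rate came from one test set):

```python
# Back-of-the-envelope: expected false positives scale linearly with
# the number of photos scanned, at the observed collision rate.
rate = 3 / 100_000_000  # 3e-8, from Apple's reported 3-in-100-million test

for photos in (10**12, 10**14):  # one trillion / one hundred trillion
    print(f"{photos:.0e} photos -> ~{photos * rate:,.0f} expected false positives")
```

At a trillion photos that is roughly 30,000 false positives; at hundreds of trillions it reaches into the millions.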
And because of all that, Apple was planning human review of every matched photo. They would, therefore, have seen every match (and every false positive). It couldn't have been hidden from Apple.
What makes you say Apple didn't know what they are? Is this a thing that happened that I'm not aware of?
If they're only supplied the hashes, Apple can't tell why the matching files are bad.
Nobody cared when it was running on iCloud. People cared that it was going to run on their phones, scanning literally everything they had.
The consumer is not at fault for believing that personal data on their own hard drive, in the phone they paid for, should not be seen by anyone but themselves unless they choose otherwise.
It's not the consumer's fault for believing this, given that it is how computer technology has always worked.
Their only fault is using Apple, when Apple has gone to extreme lengths to blur the line between what is yours and what is theirs, and effectively makes it impossible to keep things on your phone only on your phone unless you opt out of iCloud entirely. iCloud is so integrated that it's not clear to the user that everything on the phone is also in the cloud, and therefore not private.