this post was submitted on 17 Jul 2023
163 points (98.8% liked)

Technology


According to Wikipedia:

The goal of the C2PA is to define and establish an open, royalty-free industry standard that allows reliable statements about the provenance of digital content, such as its technical origin, its editing history or the identity of the publisher.

Has anyone explored this standard before? I'm curious about privacy implications, whether it's a truly open standard, whether this will become mandatory (by law or because browsers refuse to display untagged images), and if they plan on preventing people from reverse engineering their camera to learn how to tag AI-generated photos as if they were real.

top 31 comments
[–] housepanther@lemmy.goblackcat.com 65 points 1 year ago (4 children)

If their proposed solution is not completely open source, with the source visible to everyone, then I have no interest in it.

[–] ZickZack@kbin.social 23 points 1 year ago

They will make it open source, just tremendously complicated and expensive to comply with.
In general, if you see a group proposing regulations, it's usually to cement their own position: e.g. OpenAI is a frontrunner in ML for the masses but doesn't really have a technical edge over anyone else, hence their running to Congress with "please regulate us".
Regulatory compliance is always expensive and difficult, which means it favors the people who already have money and systems running right now.

There are so many ways this can be broken, intentionally or unintentionally. It's also a great way to identify and shut down government critics (if you were Chinese and everything were uniquely tagged to you, would you write about Tiananmen Square?), or to build monopolies on (dis)information.
This isn't literally forcing everyone to get a license to produce creative or factual work, but it's very close, since it makes it easy to discriminate against any creative or factual source you find unwanted.

In short, even if this is an absolutely flawless, perfect implementation of what they want to do, it will have catastrophic consequences.

[–] warmaster@lemmy.world 20 points 1 year ago* (last edited 1 year ago) (1 children)

I bet it won't. And I bet the implementations of all 3 will be out of spec and will send your shit to anyone else who will buy it.

[–] housepanther@lemmy.goblackcat.com 7 points 1 year ago (1 children)

That's a good bet! That means I'll have zero interest in it and will not use it.

[–] warmaster@lemmy.world 5 points 1 year ago (1 children)

I got fed up with MS bullshit and moved to Linux. Replaced Illustrator with Inkscape, and Photoshop with Photopea, until I can learn how to use GIMP's unintuitive UI.

[–] housepanther@lemmy.goblackcat.com 2 points 1 year ago (1 children)

There is a learning curve with GIMP. Once you get past it, GIMP is great. It does about 90-95% of what Photoshop will do and that's good enough for me. I'm fully on Linux as well. I run Arch and swear by it. I also like Open and Free BSD.

[–] VelvetStorm@lemmy.world 1 points 1 year ago

I'm an amateur macro photographer and I love taking photos and doing light tweaking to make them more presentable for the average person, but I'm definitely not going to spend hours upon hours learning to do the simplest things in GIMP and Darktable when I can learn them in PS and LR from a 10-minute video or less.

That being said, I also refuse to pay a goddamn subscription fee for something I used to own outright 20 years ago, especially considering it can't even stack or slab photos 10% as well as Zerene or Helicon.

[–] AbidanYre@lemmy.world 4 points 1 year ago

And even then I wouldn't trust those companies.

[–] barryamelton@lemmy.ml 1 points 1 year ago (1 children)

It will not matter if it is open source when it is baked into the HW. You will be their removed anyways, with no way to change it.

[–] tables@kbin.social 9 points 1 year ago

You will be their removed anyways with no way to change it.

Did you type removed or does some system in the fediverse automatically censor words?

[–] CriticalMiss@lemmy.world 16 points 1 year ago (1 children)

Fingerprinting is about to take a step forward

[–] eth0p@iusearchlinux.fyi 12 points 1 year ago* (last edited 1 year ago)

I glossed through some of the specifications, and it appears to be voluntary. In a way, it's similar to signing git commits: you create an image and choose to give provenance to (sign) it. If someone else edits the image, they can choose to keep the record going by signing the change with their identity. Different images can also be combined, and that would be noted down and signed as well.

So, suppose I see some image that claims to be an advertisement for "the world's cheapest car", a literal rectangle of sheet metal and wooden wheels. I could then inspect the image to try and figure out if that's a legitimate product by BestCars Ltd, or if someone was trolling/memeing. It turns out that the image was signed by LegitimateAdCompany, Inc and combined signed assets from BestCars, Ltd and StockPhotos, LLC. Seeing that all of those are legitimate businesses, the chain of provenance isn't broken, and BestCars being known to work with LegitimateAdCompany, I can be fairly confident that it's not a meme photo.
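The chaining idea above can be sketched in a few lines. To be clear, this is not the real C2PA manifest format: C2PA signs with X.509 certificates, while this toy uses shared-secret HMAC as a stand-in, and the company keys are made up. It only illustrates how each manifest commits to the image hash and to its parent manifests.

```python
import hashlib
import hmac
import json

# Hypothetical signing keys; real C2PA uses X.509 certificates.
KEYS = {
    "BestCars, Ltd": b"bestcars-secret",
    "LegitimateAdCompany, Inc": b"adco-secret",
}

def sign_manifest(signer: str, image_bytes: bytes, parents: list) -> dict:
    """Bind an image hash (and any parent manifests) to a signer."""
    payload = {
        "signer": signer,
        "image_sha256": hashlib.sha256(image_bytes).hexdigest(),
        "parents": [p["signature"] for p in parents],
    }
    blob = json.dumps(payload, sort_keys=True).encode()
    payload["signature"] = hmac.new(KEYS[signer], blob, "sha256").hexdigest()
    return payload

def verify_manifest(manifest: dict) -> bool:
    payload = {k: v for k, v in manifest.items() if k != "signature"}
    blob = json.dumps(payload, sort_keys=True).encode()
    expected = hmac.new(KEYS[manifest["signer"]], blob, "sha256").hexdigest()
    return hmac.compare_digest(expected, manifest["signature"])

# The ad combines and re-signs the source asset, extending the chain.
source = sign_manifest("BestCars, Ltd", b"raw car photo", [])
ad = sign_manifest("LegitimateAdCompany, Inc", b"final ad", [source])
print(verify_manifest(source) and verify_manifest(ad))  # True
```

Tampering with any field of a manifest (say, swapping the image hash) makes verification fail, which is what lets a viewer walk the chain back to the original signer.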

Now, with that being said...

It doesn't preclude scummy camera or phone manufacturers from generating identities unique to their customers and/or hardware and signing photos without the user's consent. Thankfully, at least, it seems like you can just strip away all the provenance data by copy-pasting the raw pixel data into a new image using a program that doesn't support it (Paint?).

All bets are off if you publish or upload the photo first, though—a perceptual hash lookup could just link the image back to original one that does contain provenance data.

[–] ramble81@lemmy.world 13 points 1 year ago (2 children)

I know blockchain is always a solution in search of a problem, but is this one place where it may actually work? Take a hash of the image and store that hash in a chain; that way you can always re-hash the image and see if it's been altered.
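The hash part of this is straightforward, blockchain or not. A rough sketch (the image bytes here are just a placeholder):

```python
import hashlib

# Publish a hash of the image; anyone can later re-hash their
# copy and compare it against the published value.
def image_fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"...jpeg bytes of the original photo..."
published = image_fingerprint(original)

# An edited copy, even one byte off, no longer matches.
edited = original + b"\x00"
print(image_fingerprint(edited) == published)  # False
```

The catch is that any re-encode (resize, recompress, screenshot) also changes the hash, so a plain cryptographic hash only proves bit-for-bit identity, not "same picture".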

[–] zaplachi@lemmy.ca 13 points 1 year ago

Many CSAM detecting services already use image hashing to compare to a central database.

https://www.thorn.org/blog/hashing-detect-child-sex-abuse-imagery/
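Those services use *perceptual* hashes rather than cryptographic ones, so that visually similar images match even after re-encoding. A toy "average hash" shows the principle (production systems like PhotoDNA are far more robust than this; the 8x8 grid and sample values are made up):

```python
# Toy average hash: threshold each pixel against the image mean.
# Small brightness shifts leave the bit pattern unchanged.
def average_hash(pixels):  # pixels: 8x8 grid of 0-255 grayscale values
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

bright_corner = [[200 if x < 4 and y < 4 else 50 for x in range(8)]
                 for y in range(8)]
slightly_noisy = [[p + 3 for p in row] for row in bright_corner]
print(average_hash(bright_corner) == average_hash(slightly_noisy))  # True
```

Unlike SHA-256, two perceptual hashes can also be compared by Hamming distance to measure *how* similar two images are.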

[–] xep@kbin.social 6 points 1 year ago (1 children)

What value does having a blockchain here provide, exactly?

[–] ramble81@lemmy.world 3 points 1 year ago (1 children)

Publicly traceable and verifiable hashes of an image's authenticity. Submitting a hash of the image proves who submitted it and when, and any alteration of the image would yield a different hash, so you would know you're not looking at the original image.
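The tamper-evidence part of that claim is just a hash chain: each record commits to the previous record's hash, so rewriting history anywhere invalidates everything after it. A minimal sketch (submitter names and the fixed timestamp are invented for reproducibility; this has none of a real blockchain's consensus machinery):

```python
import hashlib
import json

def make_block(prev_hash, submitter, image_hash):
    """Append-only record: who submitted which image hash, after what."""
    block = {
        "prev": prev_hash,
        "submitter": submitter,
        "image": image_hash,
        "time": 1689552000,  # fixed timestamp so the example is reproducible
    }
    blob = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(blob).hexdigest()
    return block

def chain_valid(chain):
    prev_hash = "0" * 64
    for block in chain:
        body = {k: v for k, v in block.items() if k != "hash"}
        blob = json.dumps(body, sort_keys=True).encode()
        if block["prev"] != prev_hash:
            return False
        if block["hash"] != hashlib.sha256(blob).hexdigest():
            return False
        prev_hash = block["hash"]
    return True

genesis = make_block("0" * 64, "alice", hashlib.sha256(b"photo v1").hexdigest())
edit = make_block(genesis["hash"], "bob", hashlib.sha256(b"photo v2").hexdigest())
chain = [genesis, edit]
print(chain_valid(chain))  # True

# Rewriting history breaks the chain.
genesis["image"] = hashlib.sha256(b"forged").hexdigest()
print(chain_valid(chain))  # False
```

Note the same property can be had from any append-only log with a trusted operator; the "public, no single operator" part is what the blockchain adds.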

[–] xep@kbin.social 3 points 1 year ago

How would that functionally differ from having an authority verify these hashes? Certificate authorities already provide a similar service, and C2PA would likely work in a similar way, sans any effort to implement "trustlessness."

[–] Nobilmantis@feddit.it 6 points 1 year ago

Shit smells like that browser add-on Google tells you to install if you want to opt out of Google's tracking. Nice.

[–] warmaster@lemmy.world 5 points 1 year ago (1 children)

Cool, now I need to buy an AMD CPU.

[–] ilikecoffee@lemmy.world 3 points 1 year ago

Probably won't help you all that much anyway..

[–] foggy@lemmy.world 4 points 1 year ago

This will de-anonymize memes.

[–] techLover@lemmy.world 2 points 1 year ago (1 children)
[–] ddkman@lemmy.world 2 points 1 year ago

https://odysee.com/@fireship:6/the-future-of-truth-on-the-internet:7

This video is also available on Odysee if you don't want shitty ads and your data stolen.

[–] Osa-Eris-Xero512@kbin.social 1 points 1 year ago (1 children)

What I don't understand is why having every smartphone or DSLR sign every image captured couldn't solve this problem better and faster than something like this.

[–] eth0p@iusearchlinux.fyi 5 points 1 year ago

From what I can tell, that's basically what this is trying to do. Some company can sign a source image, then other companies can sign the changes made to the image. You can see that the image was created by so-and-so and then manipulated by so-and-other-so, and if you trust them both, you can trust the authenticity of the image.

It's basically git commit signing for images, but with the exclusionary characteristics of certificate signing (for their proposed trust model, at least. It could be used more like PGP, too).

[–] barryamelton@lemmy.ml 1 points 1 year ago (2 children)

I typed b i t c h. It sucks that it got censored. It maybe depends on the community mods (I hope) or the instance.

[–] quaddo@kbin.social 3 points 1 year ago (1 children)

looks around But you said that, right? Looked them right in their eyeholes and said it?

[–] dismalnow@kbin.social 1 points 1 year ago

I am a simple man. I see Key and Peele and I say 𝓫𝓲𝓲𝓲𝓲𝓲𝓲𝓲𝓲𝓲𝓲𝓲𝓲𝓲𝓲𝓲𝓲𝓽𝓬𝓱

[–] Voyajer@kbin.social 1 points 1 year ago

Test, bitch.

[–] monk@lemmy.unboiled.info 1 points 1 year ago

And I want world peace and a unicorn.

I mean, who wins, digital fingerprinting or r/faxofafax?

[–] platysalty@kbin.social 0 points 1 year ago

I like the idea. I don't like the logos involved with the idea.
