this post was submitted on 07 Apr 2022
25 points (100.0% liked)

Technology


I'm uncertain whether deepfake porn should be banned. It's already difficult to enforce copyright on the internet, because infringing content just gets redistributed.

Similarly, it would be nearly impossible to enforce restrictions on deepfake porn, because it can easily be redistributed.

[–] Amicchan@lemmy.ml 1 points 2 years ago* (last edited 2 years ago) (1 children)

It’s also sexual abuse, on the same level as taking a nude picture without consent and posting it publicly.

I wouldn't assume that the source image was taken illegally; it could be a picture the person consented to (e.g., a married couple agreeing to a photo shoot together).

What if the "victim" was just accidentally caught in the frame of the picture?


Also, there's a danger of someone falsely claiming that a video contains a deepfake of a victim.

[–] AgreeableLandscape@lemmy.ml 2 points 2 years ago (1 children)

I wouldn’t assume that the source image was taken illegally; it could be a picture the person consented to

Yes, but unless they consented specifically to having porn made from those images, that's still highly problematic.

[–] electrodynamica@mander.xyz 0 points 2 years ago

But that's only because of where the technology is today. You need many images of the face at many angles, and the result is only a face swap onto real video that was actually captured.

Within 50 years it will be possible to create complete fakes out of whole cloth: no images, no video as a source, only a person's imagination. At that point it will be the equivalent of cartoons.