Taylor Swift AI images prompt US bill to tackle nonconsensual, sexual deepfakes::Bipartisan measure introduced in the US Senate will allow victims of 'digital forgeries' to seek civil penalties against perpetrators

[–] TheGrandNagus@lemmy.world 8 points 7 months ago* (last edited 7 months ago)

Would they?

Yes.

People break copyright and other IP laws all the time, for example.

Shit, torrenting a film carries a 10-year maximum prison sentence where I am. It doesn't stop anybody.

Speeding fines can be absolutely huge. People still speed. Etc.

A law like this is virtually impossible to enforce, the act in question is becoming trivially easy to commit, and so the law likely won't do much.

And btw that case you linked is a hell of a lot more than someone retweeting or upvoting a deepfake.

It covers someone repeatedly uploading porn of his ex-partner and blackmailing her (even in the days before the court case), impersonating her online, doxxing her, and sending porn of her to her family members.

It also covers him illegally using her bank account to pay his bills and using her personal information to apply for loans in her name.

That case is a very, very, very, very different situation from someone making a Taylor Swift deepfake.

So different that it calls into question whether you even read past the headline.