this post was submitted on 26 Jul 2024
750 points (98.6% liked)

News


The Senate unanimously passed a bipartisan bill to provide recourse to victims of porn deepfakes, or sexually explicit, non-consensual images created with artificial intelligence.

The legislation, called the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, passed in Congress' upper chamber on Tuesday. It has been led by Sens. Dick Durbin (D-Ill.) and Lindsey Graham (R-S.C.), as well as Rep. Alexandria Ocasio-Cortez (D-N.Y.) in the House.

The legislation would amend the Violence Against Women Act (VAWA) to allow people to sue those who produce, distribute, or receive the deepfake pornography, if they “knew or recklessly disregarded” the fact that the victim did not consent to those images.

[–] j4k3@lemmy.world 65 points 4 months ago* (last edited 4 months ago) (5 children)

Things have already been moving toward nobody models. I think the eventual consequence is that nobodies become the new somebodies: this will produce a lot of very well-developed nobodies and move the community toward furthering their development instead of the deepfake stuff. You'll eventually be watching Hollywood quality feature films full of nobodies. There is some malicious potential with deepfakes, but the vast majority are simply people learning the tools. This will alter that learning target, and the memes.

[–] aesthelete@lemmy.world 58 points 4 months ago (1 children)

You’ll eventually be watching Hollywood quality feature films full of nobodies.

With the content on modern streaming services, I've been doing this for a while already.

[–] xmunk@sh.itjust.works 11 points 4 months ago

Oh damn, you should try switching to Dropout.

[–] Cosmos7349@lemmy.world 12 points 4 months ago (1 children)

That's where the money is, so yes, that's where the majority of the work is. But I do think one of the drivers of this is to help protect people in more local cases: to create consequences for things like fake revenge porn or distributing deepfakes of classmates/teachers in your school, etc.

[–] j4k3@lemmy.world 2 points 4 months ago (1 children)

This might make the path to generating slightly harder, but it won't do anything to stop an intelligent person. I haven't seen a ton of info from people talking about this stuff, but from exploring on my own, especially with Stable Diffusion 3, diffusion models are very different from LLMs. The safety-alignment filtering happens external to the model, using CLIP; in the case of SD3, two CLIP text encoders and a T5-XXL LLM. The alignment filters are done with these and some trickery, and screwing with them can enable all kinds of capabilities. It is just hard to understand the effect of some tricks; SD3, for example, manually swaps an entire layer in the T5. When these mechanisms are defeated, models can generate freely, which essentially means everything is a deepfake. This is open source, so it can never be extinguished. There was a concerted effort to remove the rogue 4chanGPT, which does not have the ChatGPT-derived alignment like all other models, and it is still readily available if you know where to look.
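To make the "alignment lives outside the diffusion model" point concrete, here is a minimal sketch using the Hugging Face diffusers SD3 pipeline; the model ID, arguments, and prompt are illustrative assumptions on my part rather than anything taken from the comment above. The prompt is encoded by external text encoders (two CLIP models plus an optional T5-XXL), and those components can be dropped or swapped independently of the diffusion transformer.

```python
# A minimal sketch (assumes the Hugging Face `diffusers` library and a GPU).
# Point being illustrated: SD3's prompt conditioning comes from external text
# encoders, not from the diffusion transformer itself, so those components
# can be removed or replaced by the user.
import torch
from diffusers import StableDiffusion3Pipeline

pipe = StableDiffusion3Pipeline.from_pretrained(
    "stabilityai/stable-diffusion-3-medium-diffusers",
    text_encoder_3=None,   # drop the T5-XXL text encoder; the two CLIP encoders remain
    tokenizer_3=None,
    torch_dtype=torch.float16,
).to("cuda")

# The prompt is turned into conditioning embeddings by the external encoders,
# then handed to the diffusion transformer to denoise into an image.
image = pipe(
    "a portrait of nobody in particular, studio lighting",
    num_inference_steps=28,
    guidance_scale=7.0,
).images[0]
image.save("nobody.png")
```

The specific arguments matter less than the design point: the conditioning pipeline is modular and user-controllable, which is why filtering done at that layer is easy to bypass in open-source setups.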

This bill just raises the barrier to entry and, in the end, makes such content less familiar and more powerful. In reality, we should be socially stigmatizing this while accepting the new reality, IMO. This is like an arms race. You may not like that the enemy created cannons, but banning the casting of cannons within your realm will do nothing to help you in the end. Everyone in the realm may understandably hate cannons, but you really need everyone familiar with casting and making them, and everyone in your realm needs to learn how to deal with them and what to expect. The last thing you need is a lot of ignorant people on a battlefield bunching up together because they do not understand their opponents.

These tools are also weapons. Everyone needs to understand what is truly possible, regardless of how unpleasant that may seem. They cannot have healthy skepticism without familiarity. If they do not have familiarity, they will bunch up on a battlefield facing cannons loaded with grapeshot.

[–] Cosmos7349@lemmy.world 6 points 4 months ago (1 children)

So I haven't dug deeply into the actual legislation, so someone correct me if I'm misinformed... but my understanding is that this isn't necessarily trying to raise the bar for using the technology so much as trying to create clearer legal guidelines for victims to have recourse. If we were to relate it to other weapons, it's like creating the law "it's illegal to shoot someone with a gun".

[–] j4k3@lemmy.world 2 points 4 months ago

I have not dug deeply either, but I have noticed that Civitai has shifted its wording and hosting in ways that indicated a change was coming. In practice, the changes will come from the model-hosting sites for open-source tools limiting their liability and not hosting content related to real humans.

My main concern is the stupid public reacting to some right-wing fake while lacking appropriate skepticism, like expecting detection tools to be magical instead of understanding the full spectrum of what is possible.

[–] bradorsomething@ttrpg.network 10 points 4 months ago (3 children)

Of course, eventually somebody will look exactly like the nobody, so the owners of the nobody will sue that somebody to block them from pretending to be that nobody in videos.

[–] j4k3@lemmy.world 2 points 4 months ago

Erase yo face!

[–] paraphrand@lemmy.world 1 points 4 months ago

Who’s on first?

[–] thisbenzingring@lemmy.sdf.org 1 points 4 months ago (1 children)

They'll base their nobody on dead people

[–] xmunk@sh.itjust.works 2 points 4 months ago

Finally, I can indulge my Boudica X Lincoln fantasies!

[–] ColeSloth@discuss.tchncs.de 8 points 4 months ago (1 children)

I see it eventually happening in porn, but it isn't happening in major motion pictures any time soon. People won't follow and actively go see a CGI actor in a movie like they would a real one. Hollywood pays big money for A-list stars because those people get asses in seats. That, along with the tech not being quite there yet, and Hollywood being locked under the SAG contract to only do certain things with AI, all adds up to nobodies being a ways off.

[–] FlyingSquid@lemmy.world 9 points 4 months ago (1 children)

I also remember, years ago, reading scare headlines about how the CG in Final Fantasy: The Spirits Within was so realistic that it would be the end of Hollywood actors; they would all be CG.

So I'll take that claim with a huge grain of salt.

[–] felbane@lemmy.world 2 points 4 months ago (1 children)

To be fair, The Spirits Within was pretty amazing, especially considering when it was made. I briefly had the same thought while I was still in awe of the realism, so I can definitely understand why people would believe those headlines.

I can imagine a scenario where a real voice actor is paired with a particular model and used in multiple films/roles. That's not that dissimilar to Mickey Mouse or any other famous animated character.

The idea that AICG actors will completely replace live actors is clearly ridiculous, but the future definitely has room for fully visually- and vocally-AICG personalities as film "stars" alongside real people.

[–] FlyingSquid@lemmy.world 4 points 4 months ago

What I am saying is that I think we are much further away from believable photorealistic CG actors than you think. As far as I know, microexpressions have not even been considered, let alone figured out. Microexpressions are really important.

https://en.wikipedia.org/wiki/Microexpression

That's why I think the uncanny valley problem is a lot harder to overcome than people believe.

[–] todd_bonzalez@lemm.ee 2 points 4 months ago (1 children)

You're missing the point. There may be lots of Nobody models out there that satisfy just about every possible aesthetic or body preference a person could have, and you can use them fairly guilt-free to make content that you want to see. I understand the appeal, and it is far preferable to using real people's likenesses without their consent.

But that doesn't scratch the itch people want to scratch. People grew up reading and watching Harry Potter and fantasized about fucking Hermione. They developed a crush on Emma Watson during their formative years, and it never went away completely. People want to see pervy pictures of her, specifically, and AI has given them broad access.

People have crushes, and their crushes have Instagram feeds. Crop out 60-100 good face shots and a mid-grade gaming PC can churn out a LoRA or TI overnight. Then they'll be generating the nudes they wish their crushes were texting them.

And some people are just fucked up. They hate women, and they want to exert power over them. Any woman with enough Instagram photos is a potential target: deepfakes and nudified photos are trivial to produce and can then be used to try to humiliate or blackmail her online.

People are already nudifying pictures of kids, DMing those kids the photos, blackmailing them for actual nudes, and then escalating from there once they've trapped some vulnerable child in an abusive cycle.

There is little to nothing we can do to stop this technology from being available, but we absolutely need to lay the groundwork for holding people responsible, civilly and criminally, for abusing this technology to hurt people.

[–] j4k3@lemmy.world -1 points 4 months ago

I don't feel like arguing, but AI has not enabled people like this. It has simply shone a spotlight on the true spectrum of people.

Personally, I realized what fixation was and what it meant with regard to sexuality at around 12 years old. It is an adolescent behavior. There is nothing wrong with knowing examples of what you find attractive. It is better to understand them, and if the person is mature enough, to break down their fundamental psychology. If you look up scientific definitions of beauty, you will land in the realm of neoteny in humans. Youthful appeal figures in a current leading hypothesis about how homo sapiens evolved in the first place, as part of the explanation for why sexual maturity happens so early compared to cognitive maturity, and why girls mature before boys. There is a lot more underpinning the appeal of what one sees when they are younger. I believe that, for the vast majority of humans, it is a protection to understand these elements of the true human psyche despite the present cultural taboos. If there is a safe outlet, and a person is made fully aware of what they do and do not find attractive, free from any external bias, that knowledge enables them to better recognize dangerous situations where they need to be on guard.

My aunt was raped, as was someone I dated. Both cases involved drunken family members in unexpected situations where terrible choices were made. I find it repulsive to think about hurting someone for the pointlessness of sex. It's a 5-second drug hit in the brain; so the fuck what. There are way more effective drugs, and a fucking hand does a better job 9 times out of 10 if you're really honest about it. I'm for a cultural shift of normalizing the fact that humans like youthful appeal, not because I want kiddie porn to exist, but because I want people to thoroughly understand that the taboo is more than just a rule, and that if they find something attractive that is dangerous, they should be aware of everything involved. Don't call them perverted or messed up. They are like someone with an addiction that is bad for their health, like drinking, smoking, or harder drugs; they need to avoid circumstances where those drugs are present. If there are a bunch of kids at some house party, I don't want someone feeling weird and crazy because they see some kid and think they are aesthetically pleasing. I want them to think that is a perfectly normal human thought. What is not normal is fucking up that kid's life by taking action. They may never imagine doing something like that, but get them very drunk and stumbling into an unexpected circumstance, and really bad things happen in real life.

Like, in the bike shops I was a Buyer for, I was often asked why I carried so many low security locks for bikes. It's simple; the goal is to keep honest people honest; to not give them an unexpected opportunity.

Nothing about this bill or AI is going to enable or change human behavior. The intelligence curve will always have more people at the bottom, and they are going to fixate on stupid things. At least celebrities knew they were putting themselves in the public spotlight, and at least they have the money to escape the reach of most of the crazies.

If anything, this could be the safe catalyst necessary for English culture to directly address and curb the whole kiddie nonsense to a much better degree, with a resolution that actually impacts behaviors, without ostracism and the adolescent mentality of blind repulsion.

I think everything happening as it is presently is the safest option and opportunity. If you take away the celebrity outlet, the really fucked-up psycho starts putting cameras in a temp rental, a bathroom toilet, or a fitting-room receptacle, or obsessing over the neighborhood kids. People will always find an outlet. We should be talking to them about their cognitive dissonance and underlying conflict so that they can be helped before they act on many of these behaviors; not because of the appeal of kids, but because of the unconscious counter-motivations created by unresolved conflict that manifest as cognitive dissonance. If the person is not condemned as a lost cause, they are far more likely to learn or seek help.

So to me, this is a disproportionate reaction to something that ultimately has little to no impact. Nothing new has been instantiated; only a spotlight has been shone on what was already there. In this instance, we could have used it to grow and mature, but we have largely failed to do so. We choose to continue our adolescent rampage, and more people get hurt as a result. Stupid people oversimplify and fail to question their biases in honest depth. I hate to see the effects and damage this blind spot causes to people I love.