this post was submitted on 08 Dec 2023
320 points (93.7% liked)

Technology


‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.

[–] Eezyville@sh.itjust.works 10 points 11 months ago (2 children)

I agree with you that nudity is an issue, but I think the real problem is this app being used on children and teenagers, who aren't used to being sexualized and shouldn't be.

[–] deft@ttrpg.network 5 points 11 months ago (1 children)

Fully agree, but I do think that's more an issue of psychology and trauma in our world. Children being nude shouldn't be a big deal; they're kids, you know?

[–] Eezyville@sh.itjust.works 6 points 11 months ago

It shouldn't be a big deal if they choose to be nude somewhere that's private to them and where they're comfortable. The people using this app to make someone nude aren't really asking for consent. And that brings up another issue: consent. If you have images of yourself posted publicly, is consent needed to alter those images? I don't know, but I don't think so, since they're in the public domain.

[–] lolcatnip@reddthat.com 5 points 11 months ago (4 children)

Nudity shouldn't be considered sexual.

[–] TORFdot0@lemmy.world 12 points 11 months ago (1 children)

Not all nudity is, but there is no non-sexual reason to use AI to undress someone without consent.

[–] Eezyville@sh.itjust.works 3 points 11 months ago

The question of consent is something I'm still trying to figure out. Do you need consent to alter an image that is available in a public space? What if you were the one who took the picture of someone in public?

[–] Pyr_Pressure@lemmy.ca 1 points 11 months ago* (last edited 11 months ago) (1 children)

Just because something shouldn't be doesn't mean it won't be. This is reality, and we can't just wish something into being true. You saying it doesn't really help anything.

[–] lolcatnip@reddthat.com -2 points 11 months ago* (last edited 11 months ago) (1 children)

Whoooooosh.

In societies that have a healthy relationship with the human body, nudity is not considered sexual. I'm not just making up fantasy scenarios.

[–] mossy_@lemmy.world 5 points 11 months ago (1 children)

So because it's not a problem in your culture, it's not a problem?

[–] lolcatnip@reddthat.com -2 points 11 months ago (1 children)

You're just really looking for an excuse to attack someone, aren't you?

[–] mossy_@lemmy.world 0 points 11 months ago

You caught me; I'm an evil villain who preys on innocent lemmings for no reason at all.

[–] gun@lemmy.ml -2 points 11 months ago (1 children)

Take it up with God or evolution then

[–] lolcatnip@reddthat.com 1 points 11 months ago (1 children)

You can't really be that stupid.

[–] gun@lemmy.ml -3 points 11 months ago