this post was submitted on 07 Sep 2023
Technology
Even if the imagery were 100% computer-generated, without a training model based on abuse, at some point the question shifts from "is somebody being hurt" to "what is acceptable in civilized society". If we accept CG CSAM, then what else? Gore porn? Snuff? Bestiality? How about a child being sexually assaulted and mutilated by animals at the same time? There is always stuff that will push the envelope further and further, and how do you even tell whether real individuals are involved, whether it's CG, or some combination of the two?
If it's all CG, then nobody real is being hurt, BUT there's still gotta be a line between acceptable and unacceptable. Society, at least Western society, has by and large decided that most of those categories are not acceptable by law, so regardless of how it's made it's still illegal, and the dissemination of such material can still have a harmful effect on society in general.
There's a very real argument that it does mess with the consumer's head. The person being hurt is the consumer of the media, and while some unhealthy behaviors can be okay within reason, there are also risks in how it shapes the consumer's view of the world and the people in it. Violent video games don't make people more violent, but imagery in porn does affect how consumers of porn view people and relationships.
I think this is a valid point, and there may indeed be a line in terms of what kind of graphic AI-gen image can or cannot be freely distributed. This should be based on evidence-supported research with the goal of reducing harmful behavior, even if our gut instinct is that banning these things outright is the best way to reduce harm. We need to follow the science.
My personal opinion is stronger when it comes to AI-generated nude images of minors in non-sexual situations. Again, any decisions we make should be evidence-based, but I suspect we will find that the prohibition of nude imagery of children makes the risks of pedophilia and sexual assault against children greater, not smaller.
Imagery of nude bodies in non-sexual situations tends to be considered sexual because of its rarity. In societies where nudity is normalized, nude bodies are not considered sexual in non-sexual contexts, and indeed, countries with this kind of culture tend to exhibit lower levels of sexual deviancy. In America in particular, nudity is very taboo, and child nudity even more so. This Puritanical belief that hiding bodies protects people from sexual urges is, I believe, misplaced. On the contrary, it tends to fetishize the nude form, especially that of minors developing secondary sexual characteristics.
This ratcheting effect of hiding child nudity more and more has led to a reality where our society as a whole cannot break free of the prohibition without putting our most vulnerable population at severe risk. In other words, anyone attempting to photograph nude children and distribute those photographs is both committing an abusive act against those children and likely fostering harmful fetishes in themselves as well as in those on the other end of the distribution chain. Simply put, none of the adults in that situation are likely to be taking part in an effort to decrease the sexualization of minors.
Now we have AI, where as others have mentioned, images can be generated basically through guesswork, combining known information about what humans look like at various ages, using drawn images to fill in the gaps. Nude imagery of underage people can be made without anyone being harmed and without any sexualization of the image itself. People who grow up in oversexualized cultures will inevitably project their sexuality onto those images, but by having access to sufficiently realistic simulated nudity, the idea is that over time they would become desensitized. The playing field between adults and children would be leveled, hopefully making underage nude imagery no longer a thing that any significant portion of society covets.
And in an ideal situation, the next generation would grow up already having access to this kind of imagery, never developing a fetish around pubescent or prepubescent nudity in the first place. And hopefully this added comfort around nudity would serve to curb some of their overstimulation when they, as adults, fantasize about seeing the naked form. It would be more about finding a partner that suits them, not about fulfilling a desire to see the rarest most prohibited sights. And we can finally start to move beyond base objectification of women. Even if that seems virtually impossible from where we currently stand.
So, even if talking about this touches a nerve for most of us, I think we have to discuss it for our own good. We may finally have a silver bullet for a problem that has plagued society for countless generations. And a puritanical knee-jerk reaction may be about to make us throw it away. Before we do, we need to think really carefully to make sure we're not doing it for all the wrong reasons.
We already have that line, though. Beheading photos, for example, aren't illegal, but they are banned from most websites.