this post was submitted on 05 Mar 2024
534 points (97.2% liked)


A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite some limitations in the data and the complexity of the issue, the chatbot showed promise in deterring illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub's UK site, with hopes for similar measures across other platforms to create a safer internet environment.

top 50 comments
[–] CameronDev@programming.dev 161 points 8 months ago

That kinda sounds reasonable. Especially if it can prevent someone from going down that rabbit hole? Good job PH.

[–] FraidyBear@lemmy.world 117 points 8 months ago (6 children)

Imagine a porn site telling you to seek help because you're a filthy pervert. That's gotta push some people to get help, I'd think.

[–] John_McMurray@lemmy.world 44 points 8 months ago* (last edited 8 months ago) (4 children)

Imagine how dumb, in addition to deranged, these people would have to be to look for child porn on a basically legitimate website. Misleading headline, too: it didn't stop anything, it just told them "not here".

[–] abhibeckert@lemmy.world 18 points 8 months ago (1 children)

We have culturally drawn a line in the sand where one side is legal and the other side of the line is illegal.

Of course the real world isn't like that - there's a range of material available and a lot of it is pretty close to being abusive material, while still being perfectly legal because it falls on the right side of someone's date of birth.

It sounds like this initiative by Pornhub's chatbot successfully pushes people away from borderline content... I'm not sure I buy that... but if it's directing some of those users to support services, then that's a good thing. I worry, though, that some people might instead be pushed over to the dark web.

[–] John_McMurray@lemmy.world 15 points 8 months ago (2 children)

Yeah...I forgot that the UK classifies some activities between consenting adults as "abusive", and it seems some people are now using that definition in the real world.

[–] A_Random_Idiot@lemmy.world 14 points 8 months ago* (last edited 8 months ago) (1 children)

I mean, is it dumb?

Didn't Pornhub face a massive lawsuit or something because of the amount of unmoderated child porn that was hidden in its bowels by uploaders (along with rape videos, revenge porn, etc.), to the point that they apparently only allow verified uploaders now and purged a huge swath of their videos?

[–] theherk@lemmy.world 12 points 8 months ago (17 children)

Until a few years ago, when they finally stopped allowing unmoderated, user-uploaded content, they had a ton of very problematic videos. And they were roasted about it in public for years, including by many of the unconsenting, sometimes underage subjects of those videos, and they did nothing. Good that they finally did, but they trained users for years that it was a place to find that content.

[–] squid_slime@lemmy.world 17 points 8 months ago

"Filthy pervert" is downplaying it, but yeah, I definitely hope to see more of this.

[–] FinishingDutch@lemmy.world 86 points 8 months ago (4 children)

Sounds like a good feature. Anything that stops people from doing that is great.

But I do have to wonder… were people really expecting to find that content on PornHub? That site certainly seems legit enough that I doubt they’d have that stuff on there. I’d imagine most actual content would be on the dark web and specialty groups, not on PH.

[–] CameronDev@programming.dev 71 points 8 months ago (2 children)

PH had a pretty big problem with CSAM a few years ago; they ended up wiping ~2/3 of their user-submitted content to try to fix it. (Note: they wiped all non-verified user-submitted videos, not all of it was CSAM.)

And I'm guessing they are trying to catch users who are trending towards questionable material. "College"✅ -> "Teen"⚠️ -> "Young Teen"⚠️⚠️⚠️ -> "CSAM"🚔 etc.
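
Purely speculative sketch of the kind of tiered term matching I'm imagining; the term lists and actions below are made-up placeholders, not Pornhub's actual rules (which aren't public):

```python
# Hypothetical tiered search-term flagging; all terms here are placeholders.
WARN_TERMS = {"example-borderline-term"}    # assumed "warn" tier
BLOCK_TERMS = {"example-prohibited-term"}   # assumed "block + show chatbot" tier

def handle_search(query: str) -> str:
    """Decide what to do with a search query: allow, warn, or block."""
    q = query.lower()
    if any(term in q for term in BLOCK_TERMS):
        # Suppress results, show the deterrence warning and the support chatbot.
        return "block_and_show_chatbot"
    if any(term in q for term in WARN_TERMS):
        # Results still shown, but the session gets nudged toward safer content.
        return "warn"
    return "allow"

print(handle_search("example-prohibited-term compilation"))  # block_and_show_chatbot
```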

[–] macrocephalic@lemmy.world 30 points 8 months ago (3 children)

That explains why it's all commercial stuff now.... So I heard.

[–] FinishingDutch@lemmy.world 18 points 8 months ago (1 children)

Wow, that bad? I was aware they purged a lot of ‘amateur’ content over concerns regarding consent to upload/revenge porn, but I didn’t know it was that much.

[–] CameronDev@programming.dev 33 points 8 months ago (1 children)

Yeah, unverified user content had a lot of problems. Also piracy, gore, etc.

https://arstechnica.com/tech-policy/2020/12/pornhub-purges-all-unverified-user-uploads-in-wake-of-abuse-allegations/

The purge appears to have hit almost 9 million of the 13.5 million videos on Pornhub as of Sunday, or nearly two-thirds of all the content hosted on the site.

[–] azertyfun@sh.itjust.works 16 points 8 months ago (3 children)

Eeeeeeeh. There's nuance.

IIRC there were only a handful of verified CSAM videos on the entire website. It's inevitable; it happens everywhere with UGC, including on here. Anecdotally, in the years leading up to the purge PH had already cleaned up its act, and from what I saw pirated content was rather well moderated. However, this time the media made a huge stink about the alleged CSAM and payment processors threatened to pull out (they are notoriously puritan; it's caused a lot of trouble for lemmynsfw's admins, for instance), so regardless of the validity of the initial claims PH had to do something to win back the trust of the payment processors, and they basically nuked every video that did not have a government ID attached.

Now, if I may speculate a little, one of the reasons it happened this way is probably that, due to its industry position, PH is way better moderated than most (if not all) websites of its size and had already verified a bunch of its creators. At the same time, the rise of OnlyFans and similar websites means that real amateur content has all but disappeared, so there was less and less reason to allow random UGC. The high moderation costs probably didn't make much sense anymore.

[–] tordenflesk@lemmy.world 14 points 8 months ago

I think it's an early prevention type of thing.

[–] Mostly_Gristle@lemmy.world 69 points 8 months ago (1 children)

The headline is slightly misleading. 2.8 million searches were halted, but according to the article they didn't attempt to figure out how many of those searches came from the same users. So thankfully the number of secret pedophiles in the UK is probably much lower than the headline might suggest.

[–] preasket@lemy.lol 74 points 8 months ago (2 children)

I suspect a lot of CSAM searches come from underage users themselves

[–] BaardFigur@lemmy.world 37 points 8 months ago* (last edited 8 months ago) (1 children)

I was one of them. I used to search "Naked girl X age", where X was my age, from the time I was around 13.

[–] lemmylem@lemm.ee 21 points 8 months ago* (last edited 8 months ago)

Same thing for me when I was 13. I freaked the fuck out when I saw a wikipedia article on the right. I thought I was going to jail the next day lmfao

[–] Dran_Arcana@lemmy.world 32 points 8 months ago (3 children)

I'd think it's probably not a majority, but I do wonder what percentage it actually is. I do have distinct memories of being like 12 and trying to find porn of people my own age instead of "gross old people" and being confused why I couldn't find anything. Kids are stupid lol, that's why laws protecting them need to exist.

Also good god when I become a parent I am going to do proper network monitoring; in hindsight I should not have been left unattended on the internet at 12.

[–] kylian0087@lemmy.world 14 points 8 months ago

I was the same back then, and I came across some stuff that was surprisingly easy to find. Only later did I realize how messed up that was.

I think monitoring is good, but there's a fine line not to cross with your child's privacy. If they suspect anything, they sure know how to work around it, and then you lose any insight.

[–] pHr34kY@lemmy.world 52 points 8 months ago (7 children)

4.4 million sounds a bit excessive. Facebook marketplace intercepted my search for "unwanted gift" once and insisted I seek help. These things have a lot of false positives.

[–] Socsa@sh.itjust.works 49 points 8 months ago (2 children)

Google does this too, my wife was searching for "slutty schoolgirl" costumes and Google was like "have a seat ma'am"

[–] prole@sh.itjust.works 21 points 8 months ago (4 children)

Google now gives you links to rehabs and addiction recovery centers when searching for harm reduction information about non-addictive drugs.

[–] gapbetweenus@feddit.de 19 points 8 months ago (11 children)

Big tech is teaching us about morality.

[–] _cnt0@sh.itjust.works 43 points 8 months ago (18 children)

Non-paywall link: https://web.archive.org/web/20240305000347/https://www.wired.com/story/pornhub-chatbot-csam-help/

There's this lingering implication that there is CSAM on Pornhub. Why bother with "searches for CSAM" if they don't return CSAM results? And what exactly constitutes a "search for CSAM"? The article and the linked one are incredibly opaque about that. Why target the consumer and not the source? This feels kind of backwards, like language policing without really addressing the problem. What do they expect to happen if they prohibit specific words/language? That people searching for CSAM will just give up? Do they expect anything beyond users changing the language they use, turning it into a permanent cat-and-mouse game? I guess I share the sentiments that motivated them to do this, but it feels so incredibly pointless.

[–] TheBlackLounge@lemm.ee 22 points 8 months ago (43 children)

Lolicon is not illegal, and neither is giving your video a title that implies CSAM.

That raises the question: what about pedophiles who intentionally seek out simulated CP to avoid hurting children?

[–] SquiffSquiff@lemmy.world 21 points 8 months ago (4 children)

Simulated CP is legally considered the same as 'actual' CP in the UK

[–] Blackmist@feddit.uk 36 points 8 months ago (3 children)

Did it? Or did it make them look elsewhere?

The amount of school uniform, braces, pigtails and step-sister porn on Pornhub makes me think they want the nonces to watch.

[–] Gradually_Adjusting@lemmy.world 13 points 8 months ago (2 children)

I miss the days when you just didn't see that shit around.

[–] BleatingZombie@lemmy.world 12 points 8 months ago (1 children)

Also, I'm curious about false positives

[–] Blackmist@feddit.uk 13 points 8 months ago (1 children)

I kind of want to trigger it to see what searches it reacts to, but at the same time I don't want my IP address on a watchlist.

[–] ocassionallyaduck@lemmy.world 22 points 8 months ago (10 children)

This is one of the more horrifying features of the future of generative AI.

There is literally no stopping it at this stage: AI-generated CSAM will be possible soon thanks to systems like Sora.

This is disgusting and awful. But one part of me hopes it can end the black market for real CSAM content forever. By flooding it with infinite fakes, users with that sickness can look at something that didn't come from a real child's suffering. It's the darkest of silver linings, I think, but I've spoken with many sexual abuse survivors who feel the same about loli hentai in Japan, in that it could be an outlet for these individuals instead of them seeking out the real thing.

Dark topics. But I hope to see more actions like this in the future. If pedos can self-isolate from IRL interactions and curb their ways with content that harms no one, then everyone wins.

[–] gapbetweenus@feddit.de 34 points 8 months ago* (last edited 8 months ago) (6 children)

The question is whether consuming AI CP helps regulate a pedophile's behavior or whether it enables a progression of the condition. As far as I know, that is an unanswered question.

[–] LodeMike@lemmy.today 21 points 8 months ago

Oh, it's just an experiment. The headline made me think someone was suing over this.

[–] TIMMAY@lemmy.world 14 points 8 months ago (4 children)

You can just encounter shit like that on phub?

[–] Gabu@lemmy.world 17 points 8 months ago

Not since the wipe, AFAIK. Still, at the bottom of the page you can (or at least could, haven't used their services in a while) see a list of recent searches from all users, and you'd often find some disturbing shit.

[–] BowtiesAreCool@lemmy.world 15 points 8 months ago (2 children)

If you read the paragraph that's literally right there, it says it happens when certain terms are searched by the user.

[–] Kusimulkku@lemm.ee 14 points 8 months ago (8 children)

I was wondering what sort of phrases get that notification, but mentioning that might be a bit counterproductive.

[–] n3uroh4lt@lemmy.ml 14 points 8 months ago (1 children)

The original report from the researchers can be found here: https://www.iwf.org.uk/about-us/why-we-exist/our-research/rethink-chatbot-evaluation/

Researchers said:

The chatbot was displayed 2.8 million times between March 2022 and August 2023, resulting in 1,656 requests for more information and Stop It Now services; and 490 click-throughs to the Stop It Now website.

So out of the 4.4 million banned queries, the chatbot was only displayed 2.8 million times (within the date range in the quote above), and only 490 of those led to a click-through to seek help. Ngl, kinda underwhelming. And I also think, given the amount of extremely edgy content already on Pornhub, this is kinda sus.
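
For a rough sense of scale (just dividing the figures quoted above): 490 click-throughs out of 2.8 million chatbot displays is about 0.0175%, or roughly 1 in 5,700 displays, and 1,656 requests for more information is about 0.06%.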
