A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite limitations in the data and the complexity of measuring deterrence, the chatbot showed promise in discouraging illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures on other platforms will create a safer internet environment.

  • CameronDev@programming.dev · 1 year ago

    That kinda sounds reasonable. Especially if it can prevent someone going down that rabbit hole? Good job PH.

  • FinishingDutch@lemmy.world · 1 year ago

    Sounds like a good feature. Anything that stops people from doing that is great.

    But I do have to wonder… were people really expecting to find that content on PornHub? That site certainly seems legit enough that I doubt they’d have that stuff on there. I’d imagine most actual content would be on the dark web and specialty groups, not on PH.

    • CameronDev@programming.dev · 1 year ago

      PH had a pretty big problem with CSAM a few years ago; they ended up wiping ~2/3rds of their user-submitted content to try to fix it. (Note: they wiped all non-verified user-submitted videos; not all of it was CSAM.)

      And I’m guessing they are trying to catch users who are trending towards questionable material: “College” ✅ -> “Teen” ⚠️ -> “Young Teen” ⚠️⚠️⚠️ -> “CSAM” 🚔, etc.
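      If I had to sketch what that might look like (a purely hypothetical illustration; the terms, weights, and threshold below are invented, not anything Pornhub or the IWF have published):

      ```python
      # Hypothetical escalation scoring over a user's recent searches.
      # Terms and weights are made up for illustration only.
      TERM_WEIGHTS = {
          "college": 0,     # benign
          "teen": 1,        # borderline
          "young teen": 3,  # strong warning signal
      }
      WARNING_THRESHOLD = 4

      def escalation_score(recent_searches: list[str]) -> int:
          """Sum the most severe matching term weight per query."""
          score = 0
          for query in recent_searches:
              lowered = query.lower()
              matches = [w for t, w in TERM_WEIGHTS.items() if t in lowered]
              if matches:
                  score += max(matches)
          return score

      def should_intervene(recent_searches: list[str]) -> bool:
          """Show the warning/chatbot once the trend crosses the threshold."""
          return escalation_score(recent_searches) >= WARNING_THRESHOLD
      ```

      With these made-up weights, should_intervene(["College", "Teen", "Young Teen"]) returns True (0 + 1 + 3 = 4), matching the escalation pattern above.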

      • FinishingDutch@lemmy.world · 1 year ago

        Wow, that bad? I was aware they purged a lot of ‘amateur’ content over concerns regarding consent to upload/revenge porn, but I didn’t know it was that much.

          • azertyfun@sh.itjust.works · 1 year ago

            Eeeeeeeh. There’s nuance.

            IIRC there were only a handful of verified CSAM videos on the entire website. It’s inevitable; it happens everywhere with UGC, including on here. Anecdotally, in the years leading up to the purge PH had already cleaned up its act, and from what I saw pirated content was rather well moderated. However, this time the media made a huge stink about the alleged CSAM, and payment processors threatened to pull out (they are notoriously puritan; it’s caused a lot of trouble for lemmynsfw’s admins, for instance). So, regardless of the validity of the initial claims, PH had to do something to win back the trust of payment processors, and they basically nuked every video that did not have a government ID attached.

            Now, if I may speculate a little: one of the reasons it happened this way is probably that, due to its industry position, PH is way better moderated than most (if not all) websites of its size and had already verified a bunch of its creators. At the same time, the rise of OnlyFans and similar websites means that real amateur content has all but disappeared, so there was less and less reason to allow random UGC. The high moderation costs probably didn’t make much sense anymore.

            • root@precious.net · 1 year ago

              Spot on. The availability of CSAM was overblown by a well-funded special interest group (Exodus Cry). The articles about it were pretty much ghostwritten by them.

              When you’re the biggest company in porn you’ve got a target on your back. In my opinion they removed all user content to avoid even the appearance of supporting CSAM, not because they were guilty of anything.

              PornHub has been very open about normalizing healthy sexuality for years, while also providing interesting data access for both scientists and the general public.

              “Exodus Cry is an American Christian non-profit advocacy organization seeking the abolition of the legal commercial sex industry, including pornography, strip clubs, and sex work, as well as illegal sex trafficking.[2] It has been described by the New York Daily News,[3] TheWrap,[4] and others as anti-LGBT, with ties to the anti-abortion movement.[5]”

              https://en.wikipedia.org/wiki/Exodus_Cry

              • azertyfun@sh.itjust.works · 1 year ago

                They’re the fuckers who almost turned OF into Pinterest as well? Not surprising in retrospect. The crazy thing is how all the news outlets ran with the narrative, and how flaky payment processors are with adult content. De-platforming sex work shouldn’t be this easy.

    • Ace! _SL/S@ani.social · 1 year ago

      It had all sorts of illegal content before they purged all unverified uploads due to legal pressure.

  • Mostly_Gristle@lemmy.world · 1 year ago

    The headline is slightly misleading. 2.8 million searches were halted, but according to the article they didn’t attempt to figure out how many of those searches came from the same users. So thankfully the number of secret pedophiles in the UK is probably much lower than the headline might suggest.

  • ocassionallyaduck@lemmy.world · 1 year ago

    This is one of the more horrifying features of the future of generative AI.

    There is literally no stopping it at this stage: AI-generated CSAM will be possible soon thanks to systems like Sora.

    This is disgusting and awful. But one part of me hopes it can end the black market of real CSAM content forever by flooding it with infinite fakes, so users with that sickness can look at something that didn’t come from a real child’s suffering. It’s the darkest of silver linings, I think, but I’ve spoken with many sexual abuse survivors who feel the same about loli hentai in Japan: it could be an outlet for these individuals instead of them finding their own.

    Dark topics. But I hope to see more actions like this in the future. If pedos can self-isolate from IRL interactions and curb their urges with content that harms no one, then everyone wins.

    • gapbetweenus@feddit.de · 1 year ago (edited)

      The question is whether consuming AI CP helps regulate a pedophile’s behavior or enables a progression of the condition. As far as I know, that is an unanswered question.

      • ocassionallyaduck@lemmy.world · 1 year ago

        So your takeaway is that I’m… against AI generative images, and thus I “protest too much”?

        I can’t tell if you’re pro AI and dislike me, or pro loli hentai and thus dislike me.

        Dude, AI images and AI video are inevitable. To pretend that won’t have huge effects on society is stupid. It’s going to reshape all news media, very quickly. If Reddit is 99% AI-generated bot spam garbage with no verification of what is authentic, Reddit is functionally dead, and we are on a train with no brakes in that direction for most public forums.

  • Kusimulkku@lemm.ee · 1 year ago

    I was wondering what sort of phrases trigger that notification, but mentioning them might be a bit counterproductive.

    • Thorny_Insight@lemm.ee · 1 year ago

      I’m not sure if it’s related, but as a life-long miniskirt lover I’ve noticed that many sites no longer return results for the term “schoolgirl”; instead you need to search for “student”.

    • Squire1039@lemm.ee (OP) · 1 year ago

      ML models have been shown to be extraordinarily good at statistically predicting your words. The terms covered are probably comprehensive.

      • Kusimulkku@lemm.ee · 1 year ago

        I think the other article talks about it being a manually curated list: while ML can surface the right words, it also surfaces random stuff, so you need to check it isn’t making spurious connections. It’s pretty interesting how it all works.
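        Roughly the kind of human-in-the-loop flow that implies (a sketch only; every name and term below is invented for the example):

        ```python
        # An ML model proposes candidate terms, but only analyst-approved
        # ones reach the live list, so spurious connections are filtered out.
        model_candidates = ["genuinely harmful phrase", "random spurious phrase"]

        def curate(candidates: list[str], approvals: dict[str, bool]) -> set[str]:
            """Keep only the candidates an analyst explicitly approved."""
            return {term for term in candidates if approvals.get(term, False)}

        live_blocklist = curate(model_candidates, {"genuinely harmful phrase": True})
        # -> {"genuinely harmful phrase"}; the spurious suggestion never goes live.
        ```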

  • FraidyBear@lemmy.world · 1 year ago

    Imagine a porn site telling you to seek help because you’re a filthy pervert. That’s gotta push some people to get help, I’d think.

    • John_McMurray@lemmy.world · 1 year ago (edited)

      Imagine how dumb, in addition to deranged, these people would have to be to look for child porn on a basically legitimate website. Misleading headline, too: it didn’t stop anything, it just told them “not here.”

  • Blackmist@feddit.uk · 1 year ago

    Did it? Or did it make them look elsewhere?

    The amount of school uniform, braces, pigtails and step-sister porn on Pornhub makes me think they want the nonces to watch.

      • Arsonistic@lemmy.ml · 1 year ago

        I know I got the warning when I searched for “young gymnast” or something like that, cuz I was trying to find a specific video I’d seen before. False positives can be annoying, but that’s the only time I’ve ever encountered it.

  • pHr34kY@lemmy.world · 1 year ago

    4.4 million sounds a bit excessive. Facebook Marketplace intercepted my search for “unwanted gift” once and insisted I seek help. These things have a lot of false positives.

  • Socsa@sh.itjust.works · 1 year ago

    Google does this too. My wife was searching for “slutty schoolgirl” costumes and Google was like “have a seat, ma’am.”

        • gapbetweenus@feddit.de · 1 year ago

          Sexuality is tightly connected to societal taboos. As long as everyone involved is a consenting adult, it’s no one else’s business. There is no need for, or benefit in, moralizing people’s sexuality.

          • r3df0x ✡️✝☪️@7.62x54r.ru · 1 year ago

            It’s still weird to sexualize children. It’s less weird when it’s teenagers and everyone is of age, but it’s a weird thing to engage in constantly.

    • prole@sh.itjust.works · 1 year ago

      Google now gives you links to rehabs and addiction recovery centers when searching for harm reduction information about non-addictive drugs.

  • _cnt0@sh.itjust.works · 1 year ago

    Non-paywall link: https://web.archive.org/web/20240305000347/https://www.wired.com/story/pornhub-chatbot-csam-help/

    There’s this lingering implication that there is CSAM on Pornhub. Why bother with “searches for CSAM” if they don’t return CSAM results? And what exactly constitutes a “search for CSAM”? The article and the linked one are incredibly opaque about that.

    Why target the consumer and not the source? This feels kind of backwards, like language policing without really addressing the problem. What do they expect to happen if they prohibit specific words/language? That people searching for CSAM will just give up? Do they expect anything beyond users changing their language, leading to a permanent cat-and-mouse game? I guess I share the sentiments that motivated them to do this, but it feels so incredibly pointless.

  • n3uroh4lt@lemmy.ml · 1 year ago

    The original report from the researchers can be found here: https://www.iwf.org.uk/about-us/why-we-exist/our-research/rethink-chatbot-evaluation/. The researchers said:

    The chatbot was displayed 2.8 million times between March 2022 and August 2023, resulting in 1,656 requests for more information and Stop It Now services; and 490 click-throughs to the Stop It Now website.

    So out of 4.4 million blocked queries, the chatbot was displayed only 2.8 million times (within the date interval in the quote above), and only 490 clicks went to seeking help. Ngl, kinda underwhelming. And I also think, given the amount of extremely edgy content already on Pornhub, this is kinda sus.
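    For scale, running the quoted figures through as rates:

    ```python
    # Funnel rates from the numbers quoted above (IWF report / article).
    displays = 2_800_000    # chatbot shown, March 2022 - August 2023
    info_requests = 1_656   # requests for more info / Stop It Now services
    click_throughs = 490    # visits to the Stop It Now website

    print(f"info requests per display:  {info_requests / displays:.4%}")   # 0.0591%
    print(f"click-throughs per display: {click_throughs / displays:.4%}")  # 0.0175%
    ```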

    • laughterlaughter@lemmy.world · 1 year ago (edited)

      It’s not really that underwhelming. Disclaimer: I don’t condone child abuse. I find it abhorrent, and I will never justify it.

      People have fantasies, though. If a dude searches for “burglar breaks in and has sex with milf,” does that mean that he wants to do this in real life? Of course not (or god I hope not!) So, some people may have searched for “dad has sex with young babysitter” and bam! Bot! Some people have a fetish for diapers - there are tons of porn of adults wearing diapers and having sex. Not my thing, but who am I to judge? So again, someone searches “sex with diapers” and bam! Bot!

      Let’s not forget that, as much as Pornhub displays a sign saying “Hey, are you 18?”, a lot of people will lie. And those young folks will also search for stupid things.

      So I don’t think that aaaaaall 1+ million searches were done by people with actual pedophilia.

      The fact that 1,600 people decided to click and inform themselves, in the UK alone, well, that’s a lot, in my opinion, and it should be something to commend, not to just say “eh. Underwhelming.”

  • interdimensionalmeme@lemmy.ml · 1 year ago

    Incredibly stupid and obviously false “think of the children” propaganda. And you all lap it up. They’re building around you a version of the panopticon so extreme and disgusting that even people in the 1800s would have been outraged to see it used against prisoners. Yet you applaud. I think this means you deserve your coming enslavement.

    • StitchIsABitch@lemmy.world · 1 year ago

      And why? I mean, it’s nice of you to make these claims, but what the hell does reducing CSAM searches have to do with the panopticon and us becoming enslaved?

    • RedFox@infosec.pub · 1 year ago

      I keep asking myself why I haven’t blocked lemmy.ml

      I keep telling myself I’ll lose ideas or comments from the good users there…

      At this point, I’ll have just blocked all their users individually anyway.

      • Buelldozer@lemmy.today · 1 year ago

        I held off on instance-filtering lemmy.ml for months for all the reasons you mentioned, but I finally gave up and did it 6 weeks ago. It made a marked improvement in my Lemmy experience, so I’d advise you to just do it.