A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite some limitations in the data and the complexity of the problem, the chatbot showed promise in deterring illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures will be adopted across other platforms to create a safer internet environment.

    • azertyfun@sh.itjust.works · 9 months ago

      Eeeeeeeh. There’s nuance.

      IIRC there were only a handful of verified CSAM videos on the entire website. It’s inevitable; it happens everywhere with UGC, including on here. Anecdotally, in the years leading up to the purge PH had already cleaned up its act, and from what I saw pirated content was rather well moderated. However, this time the media made a huge stink about the alleged CSAM and payment processors threatened to pull out (they are notoriously puritan; it has caused a lot of trouble for lemmynsfw’s admins, for instance). So regardless of the validity of the initial claims, PH had to do something to win back the trust of payment processors, and they basically nuked every video that did not have a government ID attached.

      Now, if I may speculate a little, one of the reasons it happened this way is probably that, given its industry position, PH is way better moderated than most (if not all) websites of its size and had already verified a bunch of its creators. At the same time, the rise of OnlyFans and similar websites means that real amateur content has all but disappeared, so there was less and less reason to allow random UGC. The high moderation costs probably didn’t make much sense anymore.

      • root@precious.net · 9 months ago

        Spot on. The availability of CSAM was overblown by a well-funded special interest group (Exodus Cry). The articles about it were pretty much ghostwritten by them.

        When you’re the biggest company in porn you’ve got a target on your back. In my opinion they removed all user content to avoid even the appearance of supporting CSAM, not because they were guilty of anything.

        PornHub has been very open about normalizing healthy sexuality for years, while also providing interesting data access for both scientists and the general public.

        “Exodus Cry is an American Christian non-profit advocacy organization seeking the abolition of the legal commercial sex industry, including pornography, strip clubs, and sex work, as well as illegal sex trafficking.[2] It has been described by the New York Daily News,[3] TheWrap,[4] and others as anti-LGBT, with ties to the anti-abortion movement.[5]”

        https://en.wikipedia.org/wiki/Exodus_Cry

        • azertyfun@sh.itjust.works · 9 months ago

          They’re the fuckers who almost turned OF into Pinterest as well? Not surprising in retrospect. The crazy thing is how all the news outlets ran with the narrative and how flaky payment processors are with adult content. De-platforming sex work shouldn’t be this easy.

      • CameronDev · 9 months ago

        Yeah, there were a lot of reasons. CSAM was just the loud one.