Several months ago Beehaw received a report about CSAM (Child Sexual Abuse Material). As an admin, I had to investigate this in order to verify it and take the next steps. This was the first time in my life that I had ever seen images such as these. Not to go into great detail, but the images were of a very young child performing sexual acts with an adult.

The explicit nature of these images, the gut-wrenching shock and horror, the disgust and helplessness were very overwhelming to me. Those images are burnt into my mind and I would love to get rid of them but I don’t know how or if it is possible. Maybe time will take them out of my mind.

In my strong opinion, Beehaw must seek a platform where NO ONE will ever have to see these types of images. A software platform that makes it nearly impossible for Beehaw to host, in any way, CSAM.

If the other admins want to give their opinions about this, then I am all ears.

I, simply, cannot move forward with the Beehaw project unless this is one of our top priorities when choosing where we are going to go.

  • flatbield@beehaw.org · 1 year ago (edited)

    They will still need to have a developer set this up, and presumably it should be added as an option to the main code base. I thought I heard the Beehaw admins were not developers.

    There are a number of other issues that are driving the admins to dump Lemmy. The same applies there.

    • snoweA · 1 year ago

      Not sure what you mean. You do not need to be a developer to set up CloudFlare’s CSAM detection. You simply have to email the NCMEC, get an account, then check a box in CF, input some information about your NCMEC account, and then you’re good to go.

      • flatbield@beehaw.org · 1 year ago

        How does the scan happen? It has to be linked in somehow. Are you saying that choosing CloudFlare as your CDN will flag at distribution time? Or at upload time?

        • snoweA · 1 year ago

          If you use CloudFlare as your proxy, then all your instance’s traffic gets routed through CF before ever making it to your server. If someone tries to upload CSAM, it will immediately be flagged (before ever making it to your server). CloudFlare then quarantines it and automatically files a report with the National Center for Missing and Exploited Children. There’s more to the process, but the point is that putting it in the Lemmy software is not a good solution, especially when industry-standard, proven solutions already exist. You don’t have to use CF. You can also use solutions from Google, FB, Microsoft, Thorn, etc.