Several months ago Beehaw received a report about CSAM (i.e. Child Sexual Abuse Material). As an admin, I had to investigate this in order to verify and take the next steps. This was the first time in my life that I had ever seen images such as these. Not to go into great detail, but the images were of a very young child performing sexual acts with an adult.

The explicit nature of these images, the gut-wrenching shock and horror, the disgust and helplessness were overwhelming. Those images are burnt into my mind, and I would love to be rid of them, but I don't know how, or whether it is even possible. Maybe time will fade them.

In my strong opinion, Beehaw must seek a platform where NO ONE will ever have to see these kinds of images: a software platform that makes it nearly impossible for Beehaw to host CSAM in any form.

If the other admins want to give their opinions about this, then I am all ears.

I simply cannot move forward with the Beehaw project unless this is one of our top priorities when choosing where we are going to go.

  • thySatannic@beehaw.org · 1 year ago

    Wait… why is no access to CSAM hashes a good thing? Wouldn't it be easier to detect this material if the hashes were public?! I feel like I'm missing something here…

    • snowe · 1 year ago

      Giving access to CSAM hashes means anyone wanting to avoid detection simply has to check what they're about to upload against the database. If it matches, they modify the image until it doesn't (a rough sketch of that matching loop is below). It's all but guaranteed to make the problem worse, not better.
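      To illustrate, here's a minimal sketch of that kind of matching, assuming a toy perceptual "average hash"; the hash size and distance threshold are made-up values, and real systems such as PhotoDNA use far more robust, deliberately non-public algorithms:

      ```python
      # Illustrative only: a toy perceptual "average hash" (aHash).
      # Real CSAM-detection systems are far more robust and non-public.
      from PIL import Image

      def average_hash(path: str, size: int = 8) -> int:
          """Shrink to a size x size grayscale image, then set one bit per
          pixel depending on whether it is brighter than the mean."""
          pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
          mean = sum(pixels) / len(pixels)
          bits = 0
          for p in pixels:
              bits = (bits << 1) | (p > mean)
          return bits

      def hamming(a: int, b: int) -> int:
          # Number of differing bits between two hashes.
          return bin(a ^ b).count("1")

      def matches_known(upload: str, known_hashes: set[int], threshold: int = 5) -> bool:
          # A small Hamming distance means "perceptually the same image",
          # so minor edits (re-encoding, slight crops) still match. But
          # anyone who can run this check against the real database can
          # keep editing an image until it returns False.
          h = average_hash(upload)
          return any(hamming(h, known) <= threshold for known in known_hashes)
      ```

      The whole point of a perceptual hash is that small edits still land within the distance threshold; the danger described above is that with access to the database, an uploader can iterate edits until the check no longer matches.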

      • sarmale@lemmy.zip · 1 year ago

        Question: from what I saw, it seems like every CSAM image ever found is assigned its own hash. Isn't it unscalable to assign a separate hash to everything? And does that mean that most CSAM images have been detected before?