I noticed a bit of panic around here lately, and since I have had to continuously fight against pedos for the past year, I have developed tools to help me detect and prevent this kind of content.

As luck would have it, we recently published one of our anti-CSAM checker tools as a Python library that anyone can use, so I thought I could use it to help lemmy admins feel a bit safer.

The tool can either go through all the images in your object storage and delete any CSAM it finds, or it can run continuously and scan new images as they are uploaded. The suggested approach is to run it once with --all, and then run it as a daemon and leave it running.
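For anyone curious what that --all pass amounts to conceptually, here is a minimal sketch of the general idea, not the actual library code: walk the object-storage bucket pict-rs writes to, run each image through a classifier, and delete whatever gets flagged. The endpoint, credentials, bucket name, and the is_suspect() classifier are placeholders I made up for illustration; the real tool has its own configuration and model.

```python
# Rough sketch of an "--all" style sweep over a pict-rs object-storage bucket.
# Everything named here (endpoint, bucket, is_suspect) is a placeholder,
# not the library's real API.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://object-storage.example.com",  # placeholder endpoint
    aws_access_key_id="...",
    aws_secret_access_key="...",
)


def is_suspect(image_bytes: bytes) -> bool:
    """Placeholder for the GPU-based image classifier."""
    raise NotImplementedError


def sweep_bucket(bucket: str) -> None:
    # List every object, fetch it, classify it, and delete flagged images.
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        for obj in page.get("Contents", []):
            key = obj["Key"]
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
            if is_suspect(body):
                s3.delete_object(Bucket=bucket, Key=key)
                print(f"deleted {key}")


sweep_bucket("pictrs-media")  # placeholder bucket name
```

The daemon mode works on the same principle, just applied to newly arriving objects instead of a one-off listing.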

A better option would be to retrieve the exact images uploaded via the lemmy/pict-rs API, but we're not quite there yet.

Let me know if you run into any issues or have improvements to suggest.

EDIT: Just to clarify, you should run this on your desktop PC with a GPU, not on your lemmy server!

  • snowe · 10 months ago
    removing inappropriate sexual content is completely different than CSAM… a point you seem to be trying INCREDIBLY hard to not understand. Dude, just quit. Please please stop with this nonsense. You are unable to let it go. Every one of us builds bad software at some point. For you this is it. Just let it go, go build something else.

    Sheesh.

    • db0 (OP) · 10 months ago (edited)
      Lol, I really pity your users if that’s what you’re like talking to other admins.

      Look, maybe I can make it very plain for you to finally get. You will not be able to recognise it’s not ai generated and your “proper solution” won’t catch it. Your users will freak the fuck out.

      Use your brain. I’m out

      • snowe · 10 months ago

        it’s clear from your comments here and in matrix that you think you’re always right and you clearly can’t take criticism. Good luck in the future dude. You’re going to need it.