Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • @[email protected]
    1
    1 year ago

    That Wikipedia article is about CP, a broader topic. Practically zero authorities include illustrated and simulated forms of CP in their definitions of CSAM.

    • @[email protected]
      1
      1 year ago

      I assumed it was the same thing, but if you’re placing the bar of acceptable content below child porn, I don’t know what to tell you.

      • @[email protected]
        1
        1 year ago

        That’s not what I was debating. I was debating whether or not it should be reported to the authorities. I made it clear in my previous comment that it is disturbing and should always be defederated.

        • @[email protected]
          1
          1 year ago

          Ah. It depends on the jurisdiction the instance is in.

          Mastodon has a lot of lolicon shit on Japan-hosted instances for that reason.

          Lolicon is illegal under the US PROTECT Act of 2003 and in plenty of other countries.