• BombOmOm@lemmy.world · 1 year ago

    Drawn art depicting minors in sexual situations has been deemed protected as free speech in the US. That’s why, at least in the US, you don’t have to worry about the anime girl who’s 17 landing you in prison on child porn charges. The reasoning: the anime girl is not sentient, so there is no victim, and therefore the creation of that art is protected as free speech.

    I suspect a similar thing will happen with this. As long as it does not depict a real person, the completely invented person is not sentient, there is no victim, and it will fall under free speech. At least in the US.

    However, it is likely a very, very bad idea to possess any photo-realistic art of this kind, as it may not be clear to authorities whether it came from AI or whether there is in fact a person being victimized. Doubly so if you downloaded it from someone else, since you can’t know whether it depicts a real person either.

    • fubo@lemmy.world · 1 year ago

      Deepfakes of an actual child should be considered defamatory use of a person’s image, but they aren’t evidence of actual abuse the way real CSAM is.

      Remember, the original point of the term “child sexual abuse material” was to distinguish images/video made through the actual abuse of a child, from depictions not involving actual abuse – such as erotic Harry Potter fanfiction, anime characters, drawings from imagination, and the like.

      Purely fictional depictions, not involving any actual child being abused, are not evidence of a crime. Even deepfake images depicting a real person, but without their actual involvement, are a different sort of problem from actual child abuse. (And should be considered defamatory, same as deepfakes of an adult.)

      But if a picture does not depict a crime of abuse, and does not depict a real person, it is basically an illustration, same as if it were drawn with a pencil.

      • Uranium3006@kbin.social · 1 year ago

        Remember, the original point of the term “child sexual abuse material” was to distinguish images/video made through the actual abuse of a child, from depictions not involving actual abuse – such as erotic Harry Potter fanfiction, anime characters, drawings from imagination, and the like.

        Although that distinction lasted about a week before the same bad actors who cancel people over incest fanfic started calling all of the latter CSEM too.

        • fubo@lemmy.world · 1 year ago

          As a sometime fanfic writer, I do notice when fascists attack sites like AO3 under a pretense of “protecting children”, yes.

          • Uranium3006@kbin.social · 1 year ago

            And it’s usually fascists, or at least people who may not consider themselves as such but who think and act like fascists anyway.

      • Pyro@pawb.social · 1 year ago

        Add in an extra twist: hopefully, if the sickos are at least happy with the AI stuff, they won’t need the “real” thing.

        Sadly, a lot of it does evolve from wanting to “watch” to wanting to do.

        • hoshikarakitaridia@sh.itjust.works · 1 year ago

          Sadly, a lot of it does evolve from wanting to “watch” to wanting to do.

          This is the part where I disagree, and I would love for people to prove me wrong, because whether this is true or false will probably be the deciding factor in allowing or restricting “artificial CSAM”.

          • JohnEdwa@sopuli.xyz · 1 year ago

            Some people are sadists and rapists, yes, regardless of what age group they’d want to do it with.

    • HubertManne@kbin.social · 1 year ago

      This is sort of a problem in regular porn too. I’m not sure if the acting has improved, but sometimes I’m turned off because I can’t tell whether the acts were in some way coerced. Especially given some of the recent stories about modeling operations where they take the performers’ passports and such.

      • gregorum@lemm.ee · 1 year ago

        Yeah, I get turned off by porn even when the actors just don’t seem all that into it. “Possibly coerced” sets off alarms, although I hardly ever run across that.

    • BrianTheeBiscuiteer@lemmy.world · 1 year ago

      Possession of CSAM that’s 20 years old (i.e., the subject is now an adult) or even 100 years old (i.e., the subject is likely deceased) is still illegal. You don’t have to pay for it or create it; merely possessing it is a crime. Yeah, they’ll find a way to prosecute for images of non-existent children.

    • CptBread@lemmy.world · 1 year ago

      Anything that looks realistic should be illegal if you ask me, as otherwise it would become harder to prosecute real child porn. “Oh, that picture is just modified with AI” could be hard to disprove…

        • logicbomb@lemmy.world · 1 year ago

          We shouldn’t be prosecuting these people, but we should be figuring out how to get them help.

          An adult person who is attracted to children can obviously not have any legal sexual contact with a child, just like anybody else, and so we need to make sure they have the tools and ability to get by without that.

          I don’t know what’s best for these people. Maybe the best way to help them is to let them have this fake material. Maybe the best way to help them is to deny them this sort of material. There’s probably some scientist out there who has studied which approach actually works.

          • Doomsider@lemmy.world · 1 year ago

            Allowing someone to act out their deranged fantasies just reinforces the behavior. No, it would not help them.

            We learned in the early eighties that letting people scream, tear things up, and generally destroy stuff did not help them move past their feelings of anger. If you hit things to deal with anger, it becomes a feedback loop: you hit more things, more often, to deal with the emotion.

      • Supermariofan67 · 1 year ago

        Exactly that has already been tried and struck down by the Supreme Court in Ashcroft v. Free Speech Coalition. It turns out that porn of people over 18 very often looks the same as porn of people under 18, so such a law would ban a considerable amount of legal adult content.

  • AutoTL;DR@lemmings.world (bot) · 1 year ago

    This is the best summary I could come up with:


    NEW YORK (AP) — The already-alarming proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos, a watchdog agency warned on Tuesday.

    In a written report, the U.K.-based Internet Watch Foundation urges governments and technology providers to act quickly before a flood of AI-generated images of child sexual abuse overwhelms law enforcement investigators and vastly expands the pool of potential victims.

    In a first-of-its-kind case in South Korea, a man was sentenced in September to 2 1/2 years in prison for using artificial intelligence to create 360 virtual child abuse images, according to the Busan District Court in the country’s southeast.

    What IWF analysts found were abusers sharing tips and marveling about how easy it was to turn their home computers into factories for generating sexually explicit images of children of all ages.

    While the IWF’s report is meant to flag a growing problem more than offer prescriptions, it urges governments to strengthen laws to make it easier to combat AI-generated abuse.

    Users can still access unfiltered older versions of Stable Diffusion, however, which are “overwhelmingly the software of choice … for people creating explicit content involving children,” said David Thiel, chief technologist of the Stanford Internet Observatory, another watchdog group studying the problem.


    The original article contains 1,013 words, the summary contains 223 words. Saved 78%. I’m a bot and I’m open source!

  • BarrierWithAshes@kbin.social · 1 year ago

    Same thing is gonna happen (if it hasn’t already) with animal abuse videos and images. Silver lining is that at least no actual animals are getting hurt, but still. Grim.

  • Nurse_Robot@lemmy.world · 1 year ago

    I think the biggest worry for me at this point is what the AI trained on in order to depict these images. It’s not victimless if it needs victims of child abuse to train on.

    Edit: really fucking weird that I’m getting downvoted for being against AI training on child porn. I’m willing to go down with that ship.

    • Chozo@kbin.social · 1 year ago

      It knows what naked people look like, and it knows what children look like. It doesn’t need naked children to fill in those gaps.

      Also, these models are trained on images scraped from the clear net. Somebody would have had to manually add CSAM to the training data, which could easily be traced back to them if they did. The likelihood of actual CSAM being included in any mainstream AI’s training material is slim to none.

      • BetaDoggo_@lemmy.world · 1 year ago

        There is likely some CSAM in most of the models, as filtering it out of a several-billion-image set is nearly impossible even with automated methods. That material likely has little to no effect on outputs, however, since it is scarce and was probably tagged incorrectly.
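
        What those “automated methods” usually look like in practice is hash- or classifier-based filtering. Below is a minimal, hypothetical sketch of the hash-matching flavor in Python; the blocklist file, threshold, and paths are assumptions, not any real tool’s interface:

        ```python
        # Minimal sketch of perceptual-hash filtering over an image dataset.
        # "known_hashes.txt", the threshold, and the paths are hypothetical;
        # real systems (e.g. PhotoDNA) use proprietary hashes and tooling.
        from pathlib import Path

        from PIL import Image
        import imagehash

        # Load a hypothetical blocklist of known-bad perceptual hashes.
        known_bad = {
            imagehash.hex_to_hash(line.strip())
            for line in Path("known_hashes.txt").read_text().splitlines()
            if line.strip()
        }

        MAX_DISTANCE = 5  # Hamming distance at or below which we call it a match.

        def is_flagged(path: Path) -> bool:
            """True if the image's perceptual hash is near any known-bad hash."""
            digest = imagehash.phash(Image.open(path))
            return any(digest - bad <= MAX_DISTANCE for bad in known_bad)

        # Keep only images that miss the blocklist. At billions of images,
        # even a tiny false-negative rate leaves some material behind.
        clean = [p for p in Path("dataset").glob("**/*.jpg") if not is_flagged(p)]
        ```

        This also shows why filtering at that scale misses things: hash matching only catches material that is already known and hashed.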

        The bigger concern is users downstream fine-tuning models on their own datasets that contain this material. This has been happening for a while, though I won’t point fingers (Japan).

        There’s not a whole lot that can be done about it, but I also don’t think there’s anything that needs to be done. It’s already illegal, and it’s already removed from most platforms semi-automatically. Having more of it won’t change that.

      • Nurse_Robot@lemmy.world · 1 year ago

        Defending AI-generated child porn is a weird take, and the support you’re receiving is even more concerning.

        • Chozo@kbin.social · 1 year ago

          I’m not defending it, dipshit. I’m explaining how generative AI training works.

          The fact that you can’t see that is what’s really concerning.

    • Thorny_Insight@lemm.ee · 1 year ago

      AI can generate a picture of an astronaut riding a horse on the moon. It wasn’t trained on pictures of astronauts riding horses on the moon, though.
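
      For concreteness, here is a minimal sketch of that kind of compositional generation using the open-source diffusers library. The checkpoint name is just one publicly available example; treat the specific model choice as an assumption:

      ```python
      # Minimal sketch: composing concepts the model learned separately.
      # It was never trained on this exact scene, yet it can render it.
      import torch
      from diffusers import StableDiffusionPipeline

      # One publicly available Stable Diffusion checkpoint (an example choice;
      # any similar text-to-image model illustrates the same point).
      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5",
          torch_dtype=torch.float16,
      ).to("cuda")

      image = pipe("an astronaut riding a horse on the moon").images[0]
      image.save("astronaut_horse_moon.png")
      ```

      The model has seen astronauts, horses, and the moon separately; the prompt makes it combine them into a scene that appears nowhere in its training data.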

      really fucking weird that I’m getting downvoted for being against AI training on child porn

      Because you made that up. It’s not happening.