• TheGrandNagus@lemmy.world · 34 points · 9 hours ago (edited)

    I think another major point to consider going forward is whether it is problematic that people can generate all sorts of illegal material. If it is AI-generated, it is a victimless crime, so should it be illegal? I personally feel uncomfortable with the thought of several such things being legal, but I can’t logically argue for them being illegal without a victim.

    I’ve been thinking about this recently too, and I have similar feelings.

    I’m just gonna come out and say it without beating around the bush: what is the law’s position on AI-generated child porn?

    More importantly, what should it be?

    It goes without saying that the training data absolutely should not contain CP, for reasons that should be obvious to anybody. But what if it didn’t?

    If we’re basing the law on pragmatism rather than emotional reaction, I guess it comes down to whether creating this material would embolden paedophiles and lead to more predatory behaviour (i.e. increasing demand), or whether it would satisfy their desires enough to cause a substantial drop in predatory behaviour (i.e. lowering demand).

    And to know that, we’d need extensive and extremely controversial studies. Beyond that, even if allowing this stuff to be generated were an overall positive (and I don’t know whether it would be), would many politicians actually call for it to be allowed? It seems like the kind of thing that could ruin a political career. Nobody’s touching that with a ten-foot pole.

    • Nighed@feddit.uk · 1 point · 16 minutes ago

      I think the concern is that although it’s victimless, if it’s legal it could normalise the practice (within certain circles). This might make the users more confident to do something that does create a victim.

      Additionally, how do you tell whether it’s real or generated? If AI gets better, how will you tell?

    • General_Effort@lemmy.world · 3 points · 5 hours ago

      what is the law’s position on AI-generated child porn?

      Simulated underage porn is illegal in the EU and some other countries. In the US, I believe it is protected by the First Amendment.

      Mind that when people talk about child porn or CSAM, as far as politics is concerned, that means anything underage. When two 17-year-olds exchange nude selfies, that is child porn. There have been publicized cases of teens in the US being convicted as pedophile sex offenders for sexting.

    • WeirdGoesPro@lemmy.dbzer0.com · 11 points · 19 hours ago

      It’s so much simpler than that—it can be created now, so it will be. They will use narrative twists to post it on the clearnet, just like they do with anime (she’s really a 1000-year-old vampire, etc.). Creating laws to allow it is simply setting the rules for a phenomenon that is already going to happen.

      The only question is whether politicians will stop mudslinging long enough to have an adult conversation, or whether we will just shove everything into the more obscure parts of the internet and let it police itself.

    • KillingTimeItself@lemmy.dbzer0.com · 2 points · 14 hours ago (edited)

      what is the law’s position on AI-generated child porn?

      The simplest possible explanation here is that any porn created based on images of children is de facto illegal. If the model is trained exclusively on adults and you prompt it for child porn, that’s a grey area; it will probably follow the precedent for drawn art rather than for real content.

    • michaelmrose@lemmy.world · 3 up / 5 down · 18 hours ago

      Let’s play devil’s advocate. You find Bob the pedophile with pictures depicting horrible things. Two things are true.

      1. Although you can’t necessarily help Bob, you can lock him up, preventing him from doing harm, and permanently brand him as a dangerous person, making it less likely for actual children to be harmed.

      2. Bob can’t claim actual depictions of abuse are AI-generated and force you to find the unknown victim before you can lock him and his confederates up. If the law doesn’t distinguish between simulated and actual abuse, then in both cases Bob just goes to jail.

      A third factor is that this technology, and the inherent lack of privacy on the internet, could potentially pinpoint numerous unknown pedophiles who can, even if they haven’t done any harm yet, be prosecuted to society’s ultimate benefit, so long as you value innocent kids more than perverts.

      • shalafi@lemmy.world · 11 points · 16 hours ago

        Am I reading this right? You’re for prosecuting people who have broken no laws?

        I’ll add this: I have sexual fantasies (not involving children) that would be repugnant to me IRL. Should I be in jail for having those fantasies, even though I would never act on them?

        This sounds like some Minority Report hellscape society.

        • Clent@lemmy.dbzer0.com · 8 up / 1 down · 14 hours ago

          Correct. This quickly approaches thought crime.

          What about an AI gen of a violent rape and murder? Shouldn’t that also be illegal?

          But we have movies that have portrayed that sort of thing, graphically, for years. Do those then become illegal after the fact?

          And we also have movies of children being victimized, so do those likewise become illegal?

          We already have studies showing that watching violence does not make one violent, and while some refuse to accept that, it is well-established science.

          There is no reason to believe the same isn’t true for watching sexual assault. There have been many, many movies that contain such scenes.

          But ultimately the issue will be that there is no way to prevent it. The hardware to generate this stuff is already in our pockets. It may not be efficient, but it’s possible, and efficiency will increase.

          The prompts to generate this stuff are easily shared, and there is no way to stop that without monitoring all communication; even then, I’m sure workarounds would occur.

          Prohibition requires that society sacrifice freedoms, and we have to decide what we’re willing to sacrifice here, because as we’ve seen with other prohibitions, once we unleash the law on one, it can be impossible to undo.

          • michaelmrose@lemmy.world · 2 up / 6 down · 12 hours ago

            OK, watch adult porn, then watch a movie in which women or children are abused. Note how the abuse is in no way sexualized, exactly the opposite of porn. It often takes place off screen, and when rape does appear on screen, little to no nudity co-occurs. For children, it basically always happens off screen.

            Simulated child abuse has been federally illegal in the US for roughly 20 years, and we appear to have very little trouble telling the difference between prosecuting pedos and cinema, even while we have struggled plenty with sexuality in general.

            But ultimately the issue will become that there is no way to prevent it.

            This argument works well enough for actual child porn. We certainly don’t catch it all, but every prosecution takes one more pedo off the streets. The net effect is positive. We don’t catch most car thieves either, and nobody suggests we legalize car theft.

        • michaelmrose@lemmy.world · 1 up / 8 down · 13 hours ago (edited)

          Am I reading this right? You’re for prosecuting people who have broken no laws?

          No, I’m for making it against the law to simulate pedophile shit, as the net effect is fewer abused kids than if such images were legal. Notably, you are free to fantasize about whatever you like; it’s the actual creation and sharing of images that would be illegal. Far from being a Minority Report hellscape, it’s literally the way things already are in many places.

          • Petter1@lemm.ee · 8 points · 10 hours ago

            Lol, how can you say that so confidently? How would you know that with less AI CP you get fewer abused kids? And what is the logic behind it?

            Demand doesn’t really drop when something is illegal (the same goes for drugs). The only thing you reduce is supply, which just makes the now-illegal thing more valuable (this attracts shady money-grabbers who hate regulation, don’t give a shit about law enforcement, and therefore do illegal stuff for money), and it means you have to pay a shitton of government money maintaining all the prisons.

      • MoonlightFox@lemmy.world · 2 up / 4 down · 15 hours ago

        Good arguments. I think I am convinced that both cases should be illegal.

        If the pictures are real, they probably increase demand, which is harmful. If the person knew they were real, then the act should result in jail and forced therapy.

        If the pictures are not real, forced therapy is probably the best option.

        So I guess making it illegal in most cases, simply to force therapy, is the way to go, even if in some cases it is “victimless”. If professionals don’t find them plausibly rehabilitated, then jail time for them.

        I would assume (but don’t really know) that most pedophiles don’t truly want to act on it, don’t want to have those urges, and would voluntarily go to therapy.

        Which is why I am convinced prevention is the way to go, not sacrificing privacy. In Norway, we have anonymous ways for pedophiles to seek help. There were posters and ads for it in a lot of places a year or so back. I have not researched how it works in practice, though.