cross-posted from: https://lemmy.zip/post/15863526

Steven Anderegg allegedly used the Stable Diffusion AI model to generate photos; if convicted, he could face up to 70 years in prison

    • PoliticalAgitator@lemmy.world

      Because they are images of children being graphically raped, a form of abuse. Is an AI generated picture of a tree not a picture of a tree?

      • Daxtron2@startrek.website

        No it isn’t, not anymore than a drawing of a car is a real car, or drawings of money are real money.

        • laughterlaughter@lemmy.world

          Nobody is saying they’re real, and I now see what you’re saying.

          By your answers, your question is more “at-face-value” than people assume:

          You are asking:

          “Did violence occur in real life in order to produce this violent picture?”

          The answer is, of course, no.

          But people are interpreting it as:

          “This is a picture of a man being stoned to death. Is this picture violent, if no violence took place in real life?”

          To which the answer is, yes.

            • laughterlaughter@lemmy.world

              We’re not disagreeing.

              The question was:

              “Is this an abuse image if it was generated?”

              Yes, it is an abuse image.

              Is it actual abuse? Of course not.

                • PoliticalAgitator@lemmy.world

                  Images of children being raped are being treated as images of children being raped. Nobody has ever been caught with child pornography and charged as if they had abused the children themselves, nor is anybody advocating that people generating AI child pornography be charged as if they sexually abused a child.

                  Everything is being treated as it always has been, but you’re here arguing that it’s moral and harmless as long as an AI does it, using every semantic trick and shifted goalpost you possibly can.

                  It’s been gross as fuck to watch. I know you’re aiming for a kind of “king of rationality, capable of transcending even your disgust of child abuse” thing, but every argument you make is so trivial and unimportant that you’re coming across as someone hoping CSAM becomes more accessible.

                • laughterlaughter@lemmy.world

                  Well, that’s another story. I just answered your question. “Are these images about abuse even if they’re generated?” Yup, they are.

                  “Should people be prosecuted because of them?” Welp, someone with more expertise should answer this. Not me.

                • PoliticalAgitator@lemmy.world

                  You’ve already fucked up your own argument. You’re supposed to be insisting there’s no such thing as a “violent video game”, because representations of violence don’t count, only violence done to actual people.

        • Steal Wool@lemm.ee

          Oops, you forgot to use logic. As per the comment you’re replying to, the more apt analogy would be: is an AI generated picture of a car still a picture of a car?

          • Daxtron2@startrek.website

            That has nothing to do with logic? It’s pointing out that both drawings and AI generations are not really the things they might depict.

      • Leg@lemmy.world

        It’s a picture of a hallucination of a tree. Distinguishing real from unreal ought to be taken more seriously given the direction technology is moving.

        • PoliticalAgitator@lemmy.world

          It’s a picture of a hallucination of a tree

          So yes, it’s a tree. It’s a tree that might not exist, but it’s still a picture of a tree.

          You can’t have an image of a child being raped – regardless of if that child exists or not – that is not CSAM because it’s an image of a child being sexually abused.

          Distinguishing real from unreal ought to be taken more seriously given the direction technology is moving.

          Okay, so who are you volunteering to go through an endless stream of images and videos of children being raped to verify that each one has been generated by an AI and not a scumbag with a camera? Paedos?

          Why are neckbeards so enthusiastic about dying on this hill? They seem more upset that there’s something they’re not allowed to jerk off to than by the actual abuse of children.

          Functionally, legalising AI generated CSAM means legalising “genuine” CSAM because it will be impossible to distinguish the two, especially as paedophiles dump their pre-AI collections or feed them in as training data.

          People who do this are reprehensible, no matter what hair splitting and semantic gymnastics they employ.

          • Leg@lemmy.world

            Hey man, I’m not the one. I’m literally just saying that the images that AI creates are not real. If you’re going to argue that they are, you’re simply wrong. Should these ones be generated? Obviously I’d prefer that they not be. But they’re still effectively fabrications that I’m better off simply not knowing about.

            If you want to get into the weeds and discuss the logistics of enforcing what is essentially thought crime, that is a different discussion I’m frankly not savvy enough to have here. I have no control over the ultimate outcome, but for what it’s worth, my money says thought crime will in fact become a punishable offense within our lifetimes, and this may well be an easy catalyst to use to that end. This should put your mind at ease.

            • PoliticalAgitator@lemmy.world

              The thread is about “how are they abuse images if no abuse took place” and the answer is “because they’re images of abuse”. I haven’t claimed they’re real at any point.

              It’s not a thought crime because it’s not a thought. Nobody is being charged for thinking about raping children, they’re being charged for creating images of children being raped.

              • Leg@lemmy.world

                If the images are generated and held by a single person, it may as well be a thought crime. If I draw a picture of a man killing an animal, which is an image depicting a heinous crime spawned by my imagination, and I go to prison over this image, I would consider this a crime of incorrect thought. There are no victims, no animals are harmed, but my will spawned an image of a harmed animal. Authorities dictated I am not allowed to imagine this scenario. I am punished for it. I understand that the expression of said thought is what’s being punished, but that is very literally the only way to punish a thought to begin with (for now), hence freedom of expression being a protected right.

                The reason this is a hard issue to discuss in this context is because the topic at hand is visceral and charged. No one wants to be caught dead defending the rights of a monster, lest they be labeled a monster themselves. I see this as a failure of society to know what to do about people like this, opting instead to throw them into a box and hope they die there. If our justice system wasn’t so broken, I might give less of a shit, but as it stands I see this response as shortsighted and inhumane.

    • sxt@lemmy.world

      If the model was trained on CSAM, then it is dependent on abuse.

        • mindbleach@sh.itjust.works

          You’d figure “CSAM” was clear enough. You’d really figure. But apparently we could specify “PECR” for “photographic evidence of child rape” and people would still insist “he drew PECR!” Nope. Can’t. Try again.

          • Daxtron2@startrek.website

            Ever moving goal posts. Ever notice how the ones who cry “for the children” the most seemingly have the most to hide?

    • laughterlaughter@lemmy.world

      I mean… regardless of your moral point of view, you should be able to answer that yourself. Here’s an analogy: suppose I draw a picture of a man murdering a dog. It’s an animal abuse image, even though no actual animal abuse took place.

        • laughterlaughter@lemmy.world

          Except that it is an animal abuse image, drawing, painting, fiddle, whatever you want to call it. It’s still the depiction of animal abuse.

          Same with child abuse, rape, torture, killing or beating.

          Now, I know what you mean by your question. You’re trying to establish that the image/drawing/painting/scribble is harmless because no actual living being suffered. But that doesn’t mean that they don’t depict it.

          Again, I’m seeing this from a very practical point of view. However you see these images through the lens of your own morals or points of view, that’s a totally different thing.

            • laughterlaughter@lemmy.world

              No, they’re violent films.

              Snuff is a different thing, because it’s supposed to be real. Snuff films depict violence in a very real sense, so they’re violent. Fiction films also depict violence, and so they’re violent too. It’s just that they’re not about real violence.

              I guess what you’re really trying to say is that “Generated abuse images are not real abuse images.” I agree with that.

              But at face value, “Generated abuse images are not abuse images” is incorrect.

        • Jimmyeatsausage@lemmy.world

          It’s not child sexual assault if there was no abuse. However, the legal definition of CSAM is any visual depiction, including computer or computer-generated images of sexually explicit conduct, where […] (A) the production of such visual depiction involves the use of a minor engaging in sexually explicit conduct; (B) such visual depiction is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct; or (C) such visual depiction has been created, adapted, or modified to appear that an identifiable minor is engaging in sexually explicit conduct.

          You may not agree with that definition, but even simulated images that look like kids engaging in sexual activity meet the threshold for CSAM.

          • JackGreenEarth@lemm.ee

            Do you not know that CSAM is an acronym that stands for child sexual abuse material?

            • Possibly linux@lemmy.zip

              True, but CSAM is anything that involves minors. It’s really up to the court to decide a lot of it, but in the case above I’d imagine that the images were quite disturbing.

            • Possibly linux@lemmy.zip

              I think the court looked at the psychological aspects of it. When you look at that kind of material, you are training your brain and body to be attracted to that stuff in real life.

      • Reddfugee42@lemmy.world

        We’re discussing the underpinnings and philosophy of the legality and your comment is simply “it is illegal”

        I can only draw from this that your morality is based on laws instead of vice versa.

        • Possibly linux@lemmy.zip

          I’m in the camp that there is no reason you should have that kind of imagery, especially AI generated imagery. Think about what people often do with pornography. You do not want them doing that with children, regardless of whether it is AI generated.

          • Reddfugee42@lemmy.world

            What does want have to do with it? I’d rather trust science and psychologists to determine if this, which is objectively harmless, helps them control their feelings and gives them a harmless outlet.

            • Possibly linux@lemmy.zip

              They aren’t banning porn in general. They just don’t want to create any more sexual desires toward children. The CSAM laws came from child protection experts. Admittedly some of these people want to “ban” encryption but that’s irrelevant in this case.

      • mindbleach@sh.itjust.works

        There was no C.

        There was no SA.

        The entire point of saying “CSAM” was to distinguish evidence of child rape from depictions of imaginary events.

  • Lowlee Kun@feddit.de

    13,000 images are generated relatively fast. My PC needs like 5 seconds for a picture with SD (depending on settings, of course). So not even a day.

    Also, if pedos only created their own shit to fap to, I would consider this a win.

    • kamenoko@sh.itjust.works

      The only good pedo is a pedo permanently separated from society. Let’s start with the Catholic Church

        • andrew_bidlaw@sh.itjust.works

          Sure.

          I mostly referred to the second paragraph. Probably this person meant that it’s better that no child was harmed in the production of these 13k images, but the wording irked me, especially the ‘win’. It got me a bit salty and I didn’t elaborate, so I don’t know what exactly people thought I meant.

          So I don’t consider this a ‘win’, because it doesn’t help their urges or make them less dangerous the way therapy would, much like vaping was sometimes marketed as a healthier alternative to cigs or a way to give up smoking. I don’t want to dive into the ethics of these two kinds of CSAM, but I find that, leaving aside the aspect of production (victimless?), it’s still harmful to society as a whole to (generate,) collect and share it.

          Why the brackets? In court there are usually different levels or different articles involved, and even if production itself were treated as harmless, merely keeping a collection and participating in the trade or sharing of such materials are criminal offences in themselves. And that’s where we have to decide whether we treat them as real or not.

          Returning to vapes: because they weren’t regular cigs, many people initially thought it was okay to smoke them at work or in a classroom while they were a novelty, but later they were banned as well. That’s not the only case where the nature of what AI produces, and the responsibility for it, causes arguments, and our codified laws aren’t bleeding-edge enough to cover this, so I guess we’re living through the time when we decide the framework to evaluate and work with them.

          As silly as it is, the vape pandemic was the first thing I was reminded of, and it’s not a great comparison, because I heard about both it and AI CSAM through their use in schools - the latter from an article months ago about deepfake nudes boys made of their peers. The seemingly gated garden keeps being the most vulnerable.

  • SuperSpruce@lemmy.zip

    70 years for… Generating AI CSAM? So that’s apparently worse than actually raping multiple children?

    • Onihikage@beehaw.org

      He did more than generate it, he also sent some of it to a minor on Instagram, probably intending to get some real CSAM, or worse. For that, spending the next 70 years away from both children and computers seems appropriate to me.

        • Onihikage@beehaw.org

          It’s not about punishing him, it’s about keeping a clear threat to children away from them for as long as is necessary. Maybe he can be rehabilitated, but I’d rather start with lifelong separation from their means and targets and go from there.

  • stevedidwhat_infosec@infosec.pub

    Sensitive topic - obviously.

    However these guard rail laws, and “won’t someone think about the children” cases are a reeeeally easy way for the government to remove more power from the people.

    However, I believe if handled correctly, banning this sort of thing is absolutely necessary to combat the mental illness that is pedophilia.

    • laughterlaughter@lemmy.world

      I don’t condone child sexual abuse, and I’m definitely not a pedo (gosh, I can’t believe I have to state this.)

      But how does banning AI generated material help combating a mental illness? The mental illness will still be there, with or without images…

      • Leg@lemmy.world

        There’s something to be said about making it as difficult as possible to enable the behavior. Though that does run the risk of a particularly frustrated individual doing something despicable to an actual child. I don’t exactly have the data on how all this plays out, and frankly I don’t want to be the one to look into it. Society isn’t particularly equipped to handle an issue like this though, focusing on stigma alone to kinda try to shove it under the rug.

        • laughterlaughter@lemmy.world

          Your second sentence is exactly what I was thinking of. The big issue with pedophilia is the fact that kids can be easily manipulated (or forced!) into heinous acts. Otherwise, what’s the difference from regular porn about prisoners, slavery, necrophilia, etc.? Would we say that people who consume rape-fantasy porn will go out and rape? If a dude who is sexually attracted to women is not raping women left and right every day, all year round, because he knows it’s wrong, and we’re not labeling every heterosexual male a creep, then why would this be different with other kinds of attractions?

          But anyway. I’m not saying anything that hasn’t been discussed in the past (I’m sure.) I’m just glad I don’t have that condition (or anything similar, like attracted to volcanoes), otherwise life would definitely suck.

      • stevedidwhat_infosec@infosec.pub

        Mainly it’s a problem of enabling the problem as others have mentioned.

        It’s not a solution, per se. It doesn’t solve anything specifically, but it doesn’t have to. It’s about making it less accessible, imposing harsher consequences, and so on, to put more pressure on not continuing to participate in the activity. Ultimately it boils down to mental health and trauma. Pedophilia is a paraphilic disorder at the end of the day.

        • laughterlaughter@lemmy.world

          We don’t disagree. But this argument is different from the one you stated earlier. Your current argument is “these images are horrible. Let’s wipe them off the face of the Earth because they’re wrong.”

          But OP (Edit: oops, OP is you!) originally said “not having access to these images will help people ‘cure’ their paraphilia.” I don’t think that has any scientific basis, though I’ll be happy to stand corrected.

          Edit: clarification.

          • stevedidwhat_infosec@infosec.pub

            I am the original commentator, unless you’re referring to the poster who just posted a quote and the link to the article

            I’m not sure where you’re drawing these argument conclusions from and it’s bordering on muddying the water.

            • laughterlaughter@lemmy.world

              Sorry, yes, I was referring to what you originally said (I thought it was another commenter.)

              Well, I can say the same thing about your argument conclusions, and the same “muddying the water” opinion applies.

              Your stance is “banning this X type of content will help cure Y,” and I’d like to see the science backing this up. That is all. I’m not defending pedophilia if that’s what you’re implying with “muddying the waters.” It’s just that I’m all for evidence, even if the evidence makes us (yes, me included) uncomfortable.

                • stevedidwhat_infosec@infosec.pub

                  I’ve literally just said what I meant and you’re ignoring it. I explicitly said that it’s about making it harder to participate in the behavior. I even said it’s not a cure.

                Obvious troll. Blocked. See ya never edge lord

            • laughterlaughter@lemmy.world

              I know the difference.

              I’ve used “OP” to refer to a parent poster (or commenter) for decades, on Slashdot, Digg, Reddit and now here. I won’t change it unless there’s a major shift in the community.

    • bjorney@lemmy.ca

      It’s open source code that someone ran on their own computer, it’s not like he used paid OpenAI credits to generate the image.

      It also would set a bad precedent - it would be like charging Solomons & Fryhle because someone used their (absolutely ubiquitous) organic chemistry textbook to create methamphetamine

    • Demigodrick@lemmy.zip

      Well, the American way is not to hold the company accountable, i.e. school shootings, so yeah.

            • Possibly linux@lemmy.zip

              Still can’t really hold them liable unless they deliberately sold a weapon to someone who legally was prohibited from having a weapon.

              Shootings are more of a mental health and social media issue in my mind. The bigger question is: why did someone feel the need to kill others?

                • Demigodrick@lemmy.zip

                Still can’t really hold them liable unless they deliberately sold a weapon to someone who legally was prohibited from having a weapon.

                  That’s a very American point of view, though - America isn’t holding those who create/sell tools that do bad things to account. If gun manufacturers were held responsible for how the things they created were used, you can bet they’d suddenly be a hell of a lot safer. Which is the exact same point about AI.

                (Obviously not holding manufacturers/sellers to account is not an America-only issue, but this article is about AI and the USA so that’s the example I’m using.)

                The bigger question is why did someone feel the need to kill others?

                As a non-American, I think the general question is why on earth does the general public need semi-automatic weapons. Or really, any weapons.

                  • ArcaneSlime@lemmy.dbzer0.com

                  I mean we’re also not suing Toyota or Stolichnaya to stop drunk driving. In America the onus is on you not to do the bad thing, not on the companies or government for not preventing you from doing it. In America if you kill someone it is your fault, not Ruger’s.

                  Frankly I’m surprised it doesn’t work that way in every country, if you sell a friend your old car and he hits an old lady years or months later would you get charged? That sucks.

                • Leg@lemmy.world

                  I see the gun issue in America in the same light as the car issue. We’re in way too fucking deep, and it’s a part of our culture now. I hate both, but I acknowledge how difficult it is to do something about it.

    • JokeDeity@lemm.ee

      Just to be clear, you guys think that any company that produces anything that ends up used in a crime should face criminal charges for making the product? Yeah, makes about as much sense as anything these days.

    • jonne@infosec.pub

      I think stable diffusion is an open source AI you can run on your own computer, so I don’t see how the developers should be held responsible for that.

  • HelixDab2@lemm.ee
    6 months ago

    The basis of making CSAM illegal was that minors are harmed during the production of the material. Prior to CG, the only way to produce pornographic images involving minors was to use real, flesh-and-blood minors. But if no minors are harmed to create CSAM, then what is the basis for making that CSAM illegal?

    Think of it this way: if I make a pencil drawing of a minor being sexually abused, should that be treated as a criminal act? What if it’s just stick figures, and I’ve labeled one as a minor and the others as adults? What if I produce real pornography using adult actors who appear to be underage, and I tell everyone the actors were underage so that people believe it’s CSAM?

    It seems to me that, rationally, things like this should only be illegal when real people are being harmed, and that when there is no harm, they should not be illegal. You can make an entirely reasonable argument that pornographic images created using a real person as the basis do cause harm to the person being depicted. But what if no real person is involved?

    This seems like a very bad path to head down.