• @[email protected]
      52 points · 2 months ago

      God, that would sound so dystopian and futuristic… but to be honest, most articles about AI today would sound like that back then. Damn, people would freak out about privacy.

  • @[email protected]
    137 points · 2 months ago

    So, it’s like folding@home, but instead of donating your spare compute to science, you sell it to generate porn?

  • @[email protected]
    93 points · 2 months ago

    So… this AI company gets gaming teens to “donate” their computing power, rather than pay for render farms / GPU clouds?

    And then oblivious parents pay the power bills, effectively covering the computing costs of the AI porn company?

    Sounds completely ethical to me /s.

    • @[email protected]
      9 points · 2 months ago

      No no, they’re getting copies of digital images out of it. It’s a totally fair trade!

  • @[email protected]
    85 points · 2 months ago

    I’ll be a minority voice considering the other comments. But maybe just pay for onlyfans or whatever you guys use. I’m a generally attractive woman (I can surmise from interactions while trying to date) and I really don’t like the idea that my likeness would be used for something like this. Get your jollies off, but try and be a bit consensual about it. Is that so much to ask?

    • @[email protected]
      67 points · 2 months ago · edited

      It isn’t too much to ask. According to Dr. K of HealthyGamerGG (Harvard Psychiatrist/Instructor), research shows that the release of non-consensual porn makes the unwilling subjects suicidal over half the time. Non-consensual porn = deepfakes, revenge porn, etc. It’s seriously harmful, and there are other effects like depression, shame, PTSD, anxiety, and so on. There is functionally unlimited porn out there that is made with consent, and if someone doesn’t want to be publicly sexually explicit then that’s their choice.

      I’m not against AI porn in general (I consider it the modern version of dirty drawings/cartoons), but when it comes to specific likenesses as with deepfakes then there’s clear proof of harm and that’s enough for me to oppose it. I don’t believe there’s some inherent right to see specific people naked against their will.

      • @[email protected]
        10 points · 2 months ago

        I think it would be too big of a privacy overreach to try to ban it outright. What people do on their own computers is their own business, and there's no way to enforce a full ban without being incredibly intrusive. But as soon as it gets distributed in any way, I think it should be prosecuted as heavily as real non-consensual porn that was taken against someone's will.

      • @[email protected]
        0 points · 2 months ago · edited

        I wonder if part of the emotional risk is due to the general social stigma attached to porn. It becomes something that has to be explained and justified.

        If done to grand excess, deepfakes could crash the market on that, so to speak. Yeah, everyone saw your face on an AI-generated video. They also saw Ruth Bader Ginsburg, their Aunt Matilda, and for good measure, Barry Bonds, and that was just a typical Thursday.

        The shock value is burnt through, and “I got deepfaked” ends with a social stigma on the level of “I got in a shouting match with a cashier” or “I stumbled into work an hour late recently.”

        • @[email protected]
          2 points · 2 months ago

          My main concern is for kids and teenagers. They’ll bully people for no damn reason at all and AI porn allows for bullies to do more fucked up psychological abuse, and that could be made much worse if victims have no recourse to fight back.

    • @[email protected]
      27 points · 2 months ago

      I think the key is a lot of people don’t want to pay for porn. And in the case of deep fakes, it’s stuff they literally cannot pay money to get.

    • MuchPineapples
      20 points · 2 months ago

      Ai porn isn’t deepfake porn. The default is just a random ai generated face and body. Unless you want to it’s difficult to deepfake someone.

      • prole
        2 points · 2 months ago

        Their photos are still unwittingly being used as training data.

            • @[email protected]
              6 points · 2 months ago

              You can’t just say “excellent question” when someone asks you to clarify your point lmfao

              “They’re trying to force our kids to get vaccines so they can manipulate them with 5g wifi”

              How could they manipulate your kids with 5g signals?

              “That’s a good question innit”

              • prole
                1 point · 2 months ago

                I guess this was just one level of abstraction too much for you huh?

                The entire issue here is AI being trained on people's data without them knowing or giving permission. The question of whose likenesses and which photos are being used is an excellent one, and it's a big part of the problem here.

    • @[email protected]
      12 points · 2 months ago

      So I’m not disagreeing with you, but you’re assuming they’re making deepfake images, and the article doesn’t specify that. In fact I’d bet that it’s just AI generated “people” that don’t exist.

      What about AI porn of a person that doesn’t exist?

      • @[email protected]
        25 points · 2 months ago

        However, one of Salad’s clients is CivitAi, a platform for sharing AI generated images which has previously been investigated by 404 media. It found that the service hosts image generating AI models of specific people, whose image can then be combined with pornographic AI models to generate non-consensual sexual images.

    • @[email protected]
      7 points · 2 months ago

      I know someone who’s into really dark romance stuff, like really hardcore stuff, but she’d never do some of this due to safety reasons. I can totally see her generating scenes of herself in those situations.

    • VaultBoyNewVegas
      3 points · 2 months ago

      It shouldn’t be, but I’ve been downvoted here for speaking against deepfakes. Some people really don’t want to see the problem with them.

    • @[email protected]
      0 points · 2 months ago

      I have a question and I hope that people here will discuss this because I really want to understand the general opinion on this.

      Is it wrong to deepfake someone without their consent so long as you don’t share the content and it’s all stored locally? I’ve seen this come up and my general opinion is that it isn’t. I know that isn’t the case in the article, just want to hear why people would disagree.

      My angle is that doing a deepfake of someone in private hurts zero people and is an extension of fantasy. I don’t see the creation of fake nudes as any different from writing fantasy erotica about someone, and I also don’t see it as different from creating fake nude art of them by hand or with Photoshop. If you do it in your head anyway, which is completely normal, then aren’t we just worried about the outside effects and not the fantasizing itself?

      • William
        13 points · 2 months ago

        It’s at least as wrong as fantasizing about them if they aren’t already romantically involved with you.

        How wrong that is, is up for debate. It would definitely creep them out, so they can never find out about it.

        If it’s just in your head, at least there’s no physical way they could ever find out. You’d have to admit it. But if you have it on your hard drive, a hacker could get it and blackmail you with it, or just distribute it.

        So my stance is that there’s a non-zero chance of doing harm to them, and so it’s wrong. I wouldn’t do it. I also wouldn’t create it with Photoshop, or by hand, for the same reason.

        If you want to jerk off, do it to existing porn, or imaginary people porn. Don’t create porn of real people without their permission, even if you think nobody will ever see it other than you. Accidents happen, and they don’t deserve to bear the cost of that.

        • @[email protected]
          2 points · 2 months ago

          You’d have to admit it. But if you have it on your hard drive, a hacker could get it and blackmail you with it, or just distribute it.

          There are lots of sick fucks that will distribute it themselves and even send it to their victims to harass them directly. It’s already happening.

          I don’t think it’s possible to ban it outright, and I think what people do on their own computer is their own business so long as they aren’t connecting to other computers, but we should have strong laws against distributing it and treat it the same as distributing secretly taken real nudes against someone’s will. Victims need recourse against harassment.

        • @[email protected]
          1 point · 2 months ago

          The first part, absolutely. But I think a lot of that is biological so I don’t see fantasy as a problem. You should keep it to yourself though.

          The second part I think I’d somewhat agree with, except the hacker can’t blackmail you with it, because it’s just as likely that they created it themselves. And even if they did blackmail you, I would view that as damage caused by the hacker, not by the individual.

          Like if someone put something nasty about me down in their diary where they expected it to be private, and a hacker sent me an email of that diary page, that’s entirely the hackers fault. The diary writer was expressing an emotion or desire or whatever in complete privacy. Was their creation wrong? No, I don’t think so.

          And to be clear, I’m not saying people should turn to this type of fantasy; this is all a thought exercise in ethics. But I think a lot about this stuff because, as much potential for bad as it has, it also has some potential for good. All of the women I know experience behaviors such as stalking, obsession, unwelcome sexual advances, etc. on a regular basis. There is a reason those men don’t turn to free porn. Incel behavior is also just as bad in many ways. So could AI and deepfake stuff result in many of those men keeping that stuff to themselves more? Maybe.

          And before you say that these perverts will just send fake nudes to you and harass you that way, we should absolutely be prosecuting people that do so. That’s an entirely separate convo tho.

        • @[email protected]
          0 points · 2 months ago

          It would definitely creep them out, so they can never find out about it.

          And that’s all that’s required for it to be considered wrong IMO.

            • @[email protected]
              -1 point · 2 months ago

              How anyone could think that going so far as to invoke thoughtcrime is relevant in this discussion is beyond me. It should be self evident to anyone that fantasies are a thing. They’ve been a thing for the entire history of the human race. In no way do fantasies compare to creating reproducible and sharable media of someone in a pornographic situation without their consent.

              You can’t transplant your fantasies into someone else’s head. Your fantasies literally cannot hurt anyone. On the other hand, imagine if you found out that someone was distributing pornographic material depicting one of your loved ones. It can quite literally ruin someone’s reputation to be seen in a pornographic situation.

              Your argument is some slippery slope fallacy shit.

              • @[email protected]
                3 points · 2 months ago · edited

                Reread the comment I replied to and then reread my comment. You are putting words in my mouth. I never mentioned anything about sharing anything nor implied anything of the sort.

    • oozynozh
      -2 points · 2 months ago

      Deepfake pornography is super goony but if I had to look for a silver lining, at least nobody had to undergo the actual physical degradation of making porn. It’s still gross in its own way, but it’s a different kind of gross that seems worse in some ways but better in others.

      I don’t know… Am I off base here?

          • @[email protected]
            6 points · 2 months ago

            Ah, right, sorry. The first part of your comment makes it seem like you’re leaning the other way.

            • oozynozh
              5 points · 2 months ago

              I’m not sure if I feel strongly enough about it to have a consequential opinion either way but I’m trying to at least judge the situation objectively.

              I think you raised a valid point. The non-consensual nature of deepfakes pushes it into the realm of abuse material and maybe that’s worse overall than the general exploitation of women going on in the adult film industry, even if those are supposed to be “consensual” on paper.

      • William
        6 points · 2 months ago

        Judging by another comment here, non-consensual porn is far worse, causing suicidal thoughts and more.

        So I’d say it has all the “gross” of regular porn (which is subjective) and the additional “gross and horrifying” of violating someone.

    • @[email protected]
      -4 points · 2 months ago

      Or…just go out and meet people? Onlyfans just enables perversity to keep spreading and ruining our society.

  • @[email protected]
    40 points · 2 months ago

    If I’m reading this right, it’s a program that users sign up for to donate their processing power (and can opt in or out of adult content), which is then used by client companies to generate their own users’ content? It even says that Salad can’t view or moderate the images, so what exactly are they doing wrong besides providing service to potentially questionable companies? It makes as much sense as blaming Nvidia or Microsoft, am I missing something?

    • Cethin
      24 points · 2 months ago

      Based on the rewards, I’m assuming it’s being done by very young people. Presumably the value of rewards is really low, but these kids haven’t done the cost-benefit analysis. If I had to guess, for the vast majority it costs more in electricity than they get back, but the parents don’t know it’s happening.

      This could be totally wrong. I haven’t looked into it. This is how most of these things work though. They prey on the youth and their desire for these products to take advantage of them.

      • @[email protected]
        5 points · 2 months ago

        Honestly, what Roblox kids are willing to do for pitiful pay is scary. If you work in any kind of creative digital medium, those kids will do days of your job for a fiver, if any real money at all. It won’t be industry quality or anything, but damn, we’ve got a whole digital version of sending kids down the mines. (And some of these Roblox games can have unexpectedly big players behind them exploiting kids.)

      • @[email protected]
        3 points · 2 months ago

        Right, so it’s not like they’re being tricked into generating porn or anything. It’s not some option that they would have turned off if they’d known about it; they just don’t care what’s happening because they only want the reward. Again, I’m not saying I agree with it or that Salad’s right to do it, but if they say that’s potentially what it can be used for (and they do, because the opt-out is available), then the focus should be on the client companies using the tool for questionable purposes.

    • @[email protected]
      7 points · 2 months ago

      so what exactly are they doing wrong besides providing service to potentially questionable companies?

      Well I think that is the main point of what is wrong. I think the big question is whether the mature content toggle is on by default or not. The company says it’s off, but some users said otherwise. Dunno why the author didn’t install it and check.

      • @[email protected]
        7 points · 2 months ago

        They said they did.

        However, by default the software settings opt users into generating adult content. An option exists to “configure workload types manually” which enables users to uncheck the “Adult Content Workloads” option (via 404 media), however this is easily missed in the setup process, which I duly tested for myself to confirm.

        Honestly, and I’m not saying I support what’s being done here, the way I see it, if you’re tech-savvy enough to be interested in using a program like this, you should be looking through all of the options properly anyway. If users don’t care what they’re doing and are only interested in the rewards, that’s kind of on them.

        I just think the article is focused on the wrong company. Salad is selling a tool that is potentially being misused by users of their clients’ services. I can certainly see why that can be a problem, but based on the information given in the article, I don’t think it’s really Salad’s. If that’s ALL Salad’s used for, then that’s a different story.

        • @[email protected]
          1 point · 2 months ago

          Ah, thanks, I think I forgot that sentence by the end of the article and thought it was just a user report that it was checked by default. I really don’t think that it should be checked by default; depending on where you are, it could even get you in trouble. App setup for this kind of stuff isn’t necessarily only for power users now; it has gotten very streamlined and tested for conversion.

    • @foo
      3 points · 2 months ago

      Imagine collecting the smartest people on the planet from 100 years ago and explaining this.

  • @[email protected]
    27 points · 2 months ago

    I kinda fail to see the problem. The GPU owner doesn’t see what workload they are processing. The pr0n company is willing to pay for GPU power. The GPU owner wants to earn money with their hardware. There’s demand, there’s supply, nobody is getting hurt (AI pr0n is not illegal, at least for now), so let people do what they want to do.

    • @[email protected]
      14 points · 2 months ago

      The problem is that they are clearly targeting minors who don’t pay their own electricity bill, and don’t even necessarily have awareness that they are paying for their Fortnite skins with their parents’ money. Also: there is a good chance that the generated pictures are at some point present in the filesystem of the generating computer, and that alone is a giant can of worms that can even lead to legal trouble if the person lives in a country where some or all kinds of pornography are illegal.

      This is a shitty grift, abusing people who don’t understand the consequences of the software.

      • @[email protected]
        3 points · 2 months ago

        Agreed. Preying on children who don’t understand what they’re signing up for is shitty to begin with.

        Then, add that deepfake AI porn is unethical and likely illegal (and who knows what other kinds of potentially-illegal images are being generated…)

        And, as you point out, the files having existed in the computer could, alone, be illegal.

        Then, as an extra fuck you, burning GPU cycles to make AI images is causing CO2 emissions, GPU wear, waste heat that might trigger AC, and other negative externalities too, I’m sure…

        It’s shit all around.

    • SUPAVILLAIN
      1 point · 2 months ago · edited

      Because most AI-generated pornography models are trained on actual nudes scraped off the internet, and not just nudes of those who work in the corporate porn industry. This essentially falls under the same morality as nonconsensual/revenge porn, by allowing all and sundry to generate images from photos whose original posters were never asked for consent.

      But I forgot, this comm is plagued with treathounds that meatspace kink communities would throw out for a rule 3 breach, so I don’t know why I’m inconveniencing the electrons to explain something that even the terminally pornbrained should be able to comprehend…

  • @[email protected]
    14 points · 2 months ago

    Great. Now we’re trading pre-made traditional artwork to kids in exchange for fresh robot porn!

    • @[email protected]
      16 points · 2 months ago

      I’d rather have a wealth of new porn around than thousands of random blockchains going around.

      At least the porn will probably be useful for someone long term haha

    • Jojo, Lady of the West
      23 points · 2 months ago

      On its own, it’s just the same as hate for porn. But there’s also deep fake porn, ai porn of real people, and that’s potentially far more problematic.

        • Jojo, Lady of the West
          -1 point · 2 months ago

          Do you hate all amateur art, or just when it’s made with ai tools? Does a kid’s drawing, produced in scant seconds, with no training and remarkably little skill, hold negative value to you, or is it worth something?

          What about art produced with hours or days of effort and a specific goal in mind, but done primarily using ai with perhaps a few finishing touches?

          • @[email protected]
            6 points · 2 months ago · edited

            I love it when people get hyper defensive about this for no reason at all. Aesthetically, AI art is obviously better than a child’s scribbles, but the problem is that AI art is pure aesthetic, with no meaning behind it at all, and if you engage with art purely for the aesthetic, then you fundamentally miss the point of it. AI can’t mean anything when it produces art. It just spits out a series of 1s and 0s based on whatever nonsense you shout into it.

            It doesn’t matter how many hours you spend working on a piece, if you use AI (Edit to clarify: if you use AI to generate the art in its entirety), then the AI made the art. An AI cannot answer questions about artistic decisions it made, because it made no decisions. It’s worse than tracing—at least an amateur artist can answer why they decided to copy another artist’s work.

            Because charitable interpretation is dead, I have to clarify. I’m not saying that there is no valid use case for AI-generated art, nor am I saying that all human-made art is good. All I’m saying is that human-made art can have meaning behind it, while AI art cannot. It’s incapable of having meaning, so it isn’t really art.

            • Jojo, Lady of the West
              1 point · 2 months ago · edited

              It doesn’t matter how many hours you spend working on a piece, if you use AI, then the AI made the art.

              Except that artists can use ai as a tool to make art. Sure, the ai can’t say why that pixel looks that way, but the artist can say why this is the output that was kept. They can tell you why they chose to prompt the ai the way they did, what outputs they expected and why the ones that were kept were special, let alone what changes they may have made after and why.

              If Jackson Pollock can make art from randomness by flicking a brush, why can’t someone make art from randomness by prompting an ai? Is there a line somewhere that makes it become art, in your opinion? I don’t think it would be uncharitable to interpret the above quote as meaning you don’t believe it is possible at all to use ai as a tool in the production of art.

              If ai is the only tool used, it never makes an image, let alone art, because there was never even a human using language to prompt the ai. But from that obviously ridiculous extreme there is certainly a long spectrum, ranging through what I described above to something as far removed as a human generating landscapes for a storyboard before fully producing a movie that doesn’t include the ai outputs in any physical way. I’m sure you would claim a line exists somewhere in there, and I’m curious where.

              • @[email protected]
                2 points · 2 months ago · edited

                There’s a couple of orthogonal arguments here, and I’m going to try to address them both: are you an artist if you use AI generated art, and why do I hate AI generated art?

                Telling a machine “car, sedan, neon lights, raining, shining asphalt, night time, city lights” is not creating art. To me, it’s equivalent to commissioning art. If I pay someone $25 to draw my D&D character, then I am not an artist, I’ve simply hired one to draw what I wanted to see. Now, if I make any meaningful changes to that artwork, I could be considered an artist. For example, if I commissioned someone else to do the line work, and then I fill in the colors, we’ve both made the artwork. Of course, this can be stretched to an extreme that challenges my descriptivism. If I put a single black pixel on the Mona Lisa, can I say I collaborated on the output? Technically, yes, but I can’t take credit for anything other than putting a black pixel on it. Similarly, I feel that prompt engineers can’t take any credit for the pictures that AI produces past the prompt that they provided and whatever post-processing they do.

                As for why I hate AI art, I just hate effortless slop. It’s the exact same thing as YouTube shorts comprised of Family Guy clips and slime. I have a hard time really communicating this feeling to other people, but I know many other people feel the same way. Even aside from the ethical concerns of stealing people’s artwork to train image generators, we live in a capitalist society, and automating things like art generation and youtube shorts uploads harms the people who actually produce those things in the first place.

                • Jojo, Lady of the West
                  1 point · 2 months ago

                  Telling a machine “car, sedan, neon lights, raining, shining asphalt, night time, city lights” is not creating art. To me, it’s equivalent to commissioning art.

                  When art is commissioned, art is produced. If no human produced it, an ai did. If ai cannot produce art, then a human must have.

                  Similarly, I feel that prompt engineers can’t take any credit for the pictures that AI produces past the prompt that they provided and whatever post-processing they do.

                  I suppose I don’t understand why engineering a prompt can’t count as an artistic skill, nor why selecting from a number of generated outputs can’t (albeit to probably a much lower degree). At what point does a patron making a commission become a collaborator? And if ai fills the role of the painter, why wouldn’t you expect that line to move?

                  As for why I hate AI art, I just hate effortless slop.

                  I’m with you there. And I would take no issue with complaints about the massive amount of terrible, low-effort ai art currently being produced. But broadening the claim to include all art in which the most efficacious tool used was ai pushes it over the line for me.

          • SUPAVILLAIN
            3 points · 2 months ago · edited

            Whataboutism and JAQing off. AI models are trained off blatant mass theft; as long as the originators of the training material (1) haven’t given consent to their work being scraped and (2) aren’t getting paid for said already-done scraping, then the generator is unethical and deserving of hatred. You can’t have it both ways: if capitalism is the game that must be played, then the originators of the training data need to give their consent and they need to be paid for every byte of training data that’s been stolen from them.

            • Jojo, Lady of the West
              1 point · 2 months ago

              Hours of effort to create prompts that maneuver the model’s output until it looks closer to what you wanted, possibly with touch-up or addition steps at the end, likely needed for certain kinds of image to clean up things the ai struggles with (like, say, hands) or to add something in particular the ai didn’t understand (like, say, a monster of your own invention or something).

              It’s easy to say that doesn’t count, that the prompt engineer could have just come up with their final prompt in the first place, but then does it count when a digital painter sketches an outline a dozen times before deciding it’s where they want it? After all, the digital artist could have just drawn it the way they wanted at first blush. But I’d bet you’ll say the time the digital artist spent “counts” as time spent working on an art piece, even if you might be inclined to say the prompt engineer’s time doesn’t. I’d be interested to hear your take.

              • @[email protected]
                1 point · 2 months ago

                Dude, I don’t care how many iterations a person goes through. I care that the piece contains a bit of their soul.

                The argument you’re making fails to appreciate why two images, one made by gen AI, one by a real human person, both exactly identical pixel by pixel, could possibly be valued differently.

                If you want to know why I seem to lack respect for the prompt artist who spends a 3-month chunk of their life toiling over their latest piece, making everything just so, because some part of them desperately needs to say something and this piece is the only way they can—I would ask you to show me one.

                But further, the prompt artist doesn’t even make it. Even if they did spend the time, credit goes to the AI. The prompt artist is welcome to claim their prompt, I guess, but I don’t often see them sharing those around. Would that even be entertaining?

                • Jojo, Lady of the West
                  1 point · 2 months ago

                  Dude, I don’t care how many iterations a person goes through. I care that the piece contains a bit of their soul.

                  the prompt artist who spends a 3-month chunk of their life toiling over their latest piece,

                  I’m curious what could possibly convince you that someone put their soul into their work? Or why the assumption is always that ai is the only tool being used.

                  Here’s a list of artists using ai tools in their work.

                  But further, the prompt artist doesn’t even make it.

                  Again, ai is a tool. That’s like saying digital artists didn’t make their paintings, the printer did. Or maybe it’s like saying the director didn’t make the movie, the actors and cameras did. Actually, I really like the director analogy. They give directions to the actors as many times as they need to get the take they want, and then they finalize it later with post production.

      • @[email protected]
        2 points · 2 months ago

        But that’s the same issue of making fakes that we’ve had for 30+ years since digital manipulation became feasible.

        • @foo
          13 points · 2 months ago · edited

          Yeah sure, except now to make deep fake porn you just need to type “famous star naked riding an old man’s cock”, set 8 images for each seed, queue a job of 100 images, turn the air con to Antarctic, and make misogynistic videos about why movies are woke while the job slowly cooks your studio.

          Then when you finish, you probably have some good images of whatever famous star you like getting railed by an old man, and you can hop on YouTube and complain that people don’t think you are an artist.

          It requires almost no effort or talent to make a boatload of deep fake material. If you put any effort in you can orchestrate an image that looks pretty good.

          • Jojo, Lady of the West
            9 points · 2 months ago

            Add to that the fact that before ai, unless you were already pretty famous, no one cared enough to make nonconsensual porn of you. Now, anyone vaguely attracted to you can snap or find a few pictures and do a decent job of it without any skill or practice.

          • @[email protected]
            2 points · 2 months ago

            Ease of creation shouldn’t have a bearing on whether or not the final result is illegal. A handmade vs AI generated fake nude should be treated the same way.

            • @foo
              7 points · 2 months ago

              I didn’t argue that it shouldn’t. The difference is the ease of creation. It now requires no skill or talent to produce it, so the game has changed and it needs to be addressed, not dismissed.

        • Jojo, Lady of the West
          1 point · 2 months ago

          Well, the term deep fake is literally from the ai boom, but I understand you to mean that doctored images making it look like someone was doing porn when they didn’t were already a thing.

          And yeah, it very much was. But unless you were already a high profile individual like a popular celebrity, or mayyybe if you happened to be attractive to the one guy making them, they didn’t tend to get made of you, and certainly not well. Now, anyone with a crush and a photo of you can make your face and a pretty decent approximation of your naked body move around and make noises while doing the nasty. And they can do it many orders of magnitude faster and with less skill than before.

          So no, you don’t need ai for it to exist and be somewhat problematic, but ai makes it much more problematic.

    • @[email protected]
      4 points · 2 months ago

      One ethics quandary is AI child porn. It at least provides a non-harmful outlet for an otherwise harmful act, but it could also feed addictions and feel insufficient.