Rep. Joe Morelle, D-N.Y., appeared with a New Jersey high school victim of nonconsensual sexually explicit deepfakes to discuss a bill stalled in the House.

  • @[email protected]

    Other than vague slippery-slope fearmongering, I don’t see how banning the creation and distribution of deepfake porn is going to make AI monopolized by corporations. If you have your own personally trained and run AI model, you have complete control over what sort of content it’s generating. Why would you have issues with deepfake porn laws if you are not generating and hosting that content?

    It just doesn’t add up, there’s some logical leap here that seems almost on the level of conspiracy theories. As much as governments do tend to favor corporations over regular people there is nothing so far even vaguely suggesting that AI would be so profoundly restricted that only corporations could use it. In fact, what has been described of what is proposed so far does not target the technology at all, only the users who engage in this kind of bad conduct.

    But I profoundly disagree with this “nothing to be done about it” attitude. How would fighting it be worse than letting people suffer from it? It’s not like drugs, where the main person who might have issues is the user themselves; this affects unrelated, vulnerable people.

    If it can be identified who is making deepfake porn and where it’s being hosted, it can be taken down. You could argue that not every single responsible person will be identified, but it might still be enough to diminish the prevalence and the number of victims. And even if the remaining ones only have to be sneakier about it, that might still mean less harassment of the victims.

    You compare it to the war on drugs. Meanwhile I think of the rise of the automobile, with people crying that seat belts and traffic lights were ruining their freedom and “there’s nothing to be done” about people dying in car crashes.

    • @[email protected]

      If everyone could create their own and just run it locally, how could the laws be enforced?

      • @[email protected]

        Arguing that outlawing a crime committed with a tool outlaws the tool itself is a bad argument. Outlawing murder doesn’t outlaw knives.

        As far as enforcement, it may be enforced with varying degrees of success but the argument that someone may get away with the crime also isn’t a reason not to make it a crime.

        If someone created deepfakes using locally run models, rubbed one out, and then deleted everything, they probably wouldn’t be caught… but who really cares that they weren’t? It’s the harm to others that you’d like to prevent, and if a person never distributed the image at all, their “getting away with it” doesn’t matter much.

        Edit: I think the argument that existing laws already cover this is more compelling than any of the above arguments as far as why this new law shouldn’t be passed.

        • @[email protected]

          You conceded that no one cares if someone makes images locally then deletes them. But that’s how they’re all going to be made shortly.

          Currently folks are sharing them because not everyone has the means to create them; some folks do, and they share what they’ve made.

          Once literally everyone can just make them the moment they want to, no one will be sharing. Everyone will fall under that use case you admitted no one would care about, which is exactly what I’ve been saying. It’s 1. futile to try to stop, and 2. going to become so widespread that we as a society will stop caring about it.

          • @[email protected]

            Once literally everyone can just make them the moment they want to, no one will be sharing.

            I do not think this is true. There are reasons to generate and distribute these other than to have a personal wank off gallery.

            • @[email protected]

              Like what? Why share something when anyone curious to see it can instantly generate their own?

              • @[email protected]

                I’m curious as to why you cannot come up with any yourself, but here are a few off the top of my head: to pass them off as authentic (likely for clout purposes), to have a laugh with the boys about it, to collaborate with others on them, and to distribute them to harass, ridicule, or disparage the target of them.

                Degenerates exist in lots of shapes and forms, and not all degenerates will have enough of a sense of shame to be degenerates privately or to even know they are being degenerates at all.

                • @[email protected]

                  I don’t think you’re properly understanding the paradigm shift that’s coming with these models being open source and widely available while wearable AR smart glasses get better.

                  “You know Sharon in HR? Look at this scandalous photo of her.”

                  “Uh, I’m seeing a live generated porno of everyone in this room right now, why would I care about that.”

                  • @[email protected]

                    And I don’t think you’re fully understanding that the above is some type of fantasy you have, and will not actually be what the future is like at all.