A bipartisan group of US senators introduced a bill Tuesday that would criminalize the spread of nonconsensual, sexualized images generated by artificial intelligence. The measure comes in direct response to the proliferation of pornographic AI-made images of Taylor Swift on X, formerly Twitter, in recent days.

The measure would allow victims depicted in nude or sexually explicit “digital forgeries” to seek a civil penalty against “individuals who produced or possessed the forgery with intent to distribute it” or anyone who received the material knowing it was not made with consent. Dick Durbin, the US Senate majority whip, and senators Lindsey Graham, Amy Klobuchar and Josh Hawley are behind the bill, known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024, or the “Defiance Act.”


  • @[email protected]
    link
    fedilink
    25 months ago

    Are deep fakes fine if you run them through an oil painting filter?

    Probably, since nobody could mistake an oil painting for the real person — it’s not a deep fake anymore.

    • @MagicShel
      5 months ago

      I have about a 99% success rate at identifying AI-generated full-body images of people. People need to learn to look more closely. They look just as fake as the oil paintings.

        • @MagicShel
          5 months ago

          I think that’s relevant when the defense of oil paintings is that you can tell they aren’t real. The line can’t be “you can’t tell they’re fake,” because you can identify AI artwork 99% of the time; the other 1% is basically when the pose is arranged exactly to conceal the telltale signs and the background is so simple that it gives nothing away.

      • @[email protected]
        link
        fedilink
        15 months ago

        They look just as fake as the oil paintings.

        You can go photorealistic or even hyperrealistic with oil. And with AI, you just need a bit of post-processing.