Importantly, this took deepfake undressing from a tiny niche to a mass phenomenon:

This means it’s no longer a niche or truly exceptional practice: harassment of women with this method is now pervasive.

  • Riskable · 1 month ago

    The real problem here is that Xitter isn’t supposed to be a porn site (even though it’s hosted loads of porn since before Musk bought it). They’ve deeply integrated a porn generator into their very publicly accessible “short text posts” website. Anyone can ask it to generate porn inside any post and it’ll happily do so.

    It’s like showing up at Walmart and seeing everyone naked (and many fucking), all over the store. That’s not why you’re there (though: Why TF are you still using that shithole of a site‽).

    The solution is simple: Everyone everywhere needs to classify Xitter as a porn site. It’ll get blocked by businesses and schools and the world will be a better place.

    • db2@lemmy.world · 1 month ago

      It’s like showing up at Walmart and seeing everyone naked (and many fucking), all over the store.

      🤢🤮

      • Riskable · 1 month ago

        I don’t know how to tell you this but… Every body gives a shit. We’re born shitters.

      • Riskable · 1 month ago

        Well, the CSAM stuff is unforgivable, but I seriously doubt even the soulless demon that is Elon Musk wants his AI tool generating that. I’m sure they’re working on it. (It’s actually a hard computer-science problem: the tool is supposed to generate what the user asks for, and there will always be an infinite number of ways to trick it, since LLMs aren’t actually intelligent.)

        Porn itself is not illegal.

        • a_non_monotonic_function@lemmy.world · 1 month ago

          He has 100% control over the ability to alter or pull this product. If he’s leaving it up while it’s generating illegal pornography, that’s on him.

          And no s***, I’m concerned about the illegal stuff.

    • mjr@infosec.pub · 1 month ago

      (though: Why TF are you still using that shithole of a site‽).

      Maybe some places don’t have any supplier other than Walmart? Similarly, some places have governments that still only use the porno social network for some services.

        • Riskable · 1 month ago

          I don’t know, man… Have you even seen Amber? It might be worth an alert 🤷

  • fuzzywombat@lemmy.world · edited · 1 month ago

    I’m pretty sure that if anyone else created a website that generated CSAM, they’d be in jail by now. Just because it’s Elon Musk doing it, the authorities are fine with it? At one time, law enforcement would at least make an effort to put up the facade of a justice system that treats everyone equally. That’s how they kept the masses from rising up and dethroning the status quo. Anyone remember Martha Stewart going to jail for insider trading?

    Aren’t there Apple App Store and Google Play Store policies that say this isn’t allowed? How come the app is still available on those mobile platforms? What are EU regulators doing? No fines? Nothing?

    We’ve basically reached a point where billionaires are publicly mocking and daring the rest of us to react. Do these accelerationist billionaires really think they’ll come out ahead when the masses burn everything to the ground?

    • architect@thelemmy.club · 1 month ago

      Yes they do. They have been threatening us with their murder robots for a decade. Who do you think they are going to use those on?

      They view most everyone as takers.

    • REDACTED@infosec.pub · 1 month ago

      I actually refuse to believe you can simply undress people with Grok; more likely it’s just easily jailbroken, which suddenly makes this not the company’s problem, since the service was essentially “cracked” and used outside its Terms of Service. Still, this IS a problem.

  • No1@aussie.zone · 1 month ago

    I misread the title and thought it meant thousands of Musk undressed images per hour.

    The horror!

  • PierceTheBubble@lemmy.ml · 1 month ago

    Politicians are already hellbent on “age”-verifying social media, but Elon seems to believe there’s no urgency in this regard… Please regulate social media harder, daddy! Please, we’ve had the resources to comply with these perverse regulations for a while now. I didn’t hijack this platform just for the lefties to be able to speak their minds on alternative platforms…

  • DreamMachine@lemmy.world · 1 month ago

    Scrolled “@grok undress” and “bikini” for a bit; most of it is girls jumping on the trend, asking it to alter their own photos, plus humor posts.