Remember how we were told that genAI learns “just like humans”, how the law supposedly can’t do anything about it because of fair use, and how, I guess, all art is now owned by big tech companies?

Well, of course it’s not true. Exploiting a few of the ways in which genAI is *not* like human learners, artists can filter their digital art so that if a genAI tool consumes it, it actively degrades the model, undoing generalization and bleeding into neighboring concepts.

Can an AI tool be used to undo this obfuscation? Yes. At scale, however, doing so carries ever-increasing compute costs. This also looks like an improvable method rather than a dead end – adversarial input design is a growing field of machine learning, with more and more techniques becoming widely available. Think of it as a sort of “cryptography for semantics”, in that it imposes asymmetrical work on AI consumers (while leaving the human eye much less affected).
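For the curious, the core trick (a small, loss-increasing perturbation that a human viewer barely notices) can be sketched with the classic Fast Gradient Sign Method against a toy classifier. This is purely illustrative: Glaze and Nightshade optimize much more elaborate objectives against real feature extractors, and the toy model below is an assumption for demonstration only.

```python
import numpy as np

def fgsm_perturb(x, w, b, y_true, eps=0.05):
    """Fast Gradient Sign Method against a toy logistic classifier.

    Nudges the input x in the direction that *increases* the model's loss,
    while keeping each per-pixel change bounded by eps so a human viewer
    barely notices the difference.
    """
    # Forward pass: sigmoid(w . x + b)
    z = w @ x + b
    p = 1.0 / (1.0 + np.exp(-z))
    # Gradient of the binary cross-entropy loss with respect to the input x
    grad_x = (p - y_true) * w
    # Perturb each pixel by +/- eps in the loss-increasing direction
    return np.clip(x + eps * np.sign(grad_x), 0.0, 1.0)

# Toy "image": 8 pixels, classified by a fixed linear model
rng = np.random.default_rng(0)
x = rng.random(8)
w, b = rng.standard_normal(8), 0.1
x_adv = fgsm_perturb(x, w, b, y_true=1.0)
print(np.max(np.abs(x_adv - x)))  # perturbation stays within eps
```

The asymmetry mentioned above falls out of this structure: applying the perturbation is one gradient step, while reliably detecting and undoing it across millions of scraped images is much more expensive.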

Now we just need labor laws to catch up.

Wouldn’t it be funny if generative AI not only failed to lead to a boring dystopia, but the proliferation of this and similar techniques to protect human meaning eventually put a lot of grifters out of business?

We must have faith in the dark times. Share this with your artist friends far and wide!

  • corbin@awful.systems · 10 months ago

    I can’t endorse Glaze or Nightshade, sorry. If literally nothing else, it’s not Free Software and it’s offered with a nasty license:

    You are not permitted to … reverse engineer the Software …

    You are not permitted to … permit … any part of the Software … to be combined with or become incorporated in any other software …

    So I’m not allowed to have the discussion I’m currently having, nor to include it in any Linux distro. To me, that’s useless at best and malicious at worst. Ironic, considering that their work directly builds upon Stable Diffusion.

    Also, Nightshade will be ineffective as an offensive tool. Quoting from their paper:

    … the perturbations we optimized on poison images are able to perturb image’s features in text-to-image models, but they have limited impact on the features extracted by alignment models. … We note that it might be possible for model trainers to customize an alignment model to ensure high transferability with poison sample generation, thus making it more effective at detecting poison samples.

    This is not only an admission of failure but a roadmap for anybody who wants to work around Nightshade. Identify poisoned images by using an “alignment model,” which correlates images with sets of labels, to test whether an image is poorly labeled; if the image appears well-labeled to a human but not to an alignment model, then it may be poisoned and will need repair/corroboration from alternate sources.
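    That workaround can be sketched as a simple filtering pass. Everything below is hypothetical: `alignment_score` stands in for a real CLIP-style image/text scorer, and the threshold would need tuning in practice.

```python
def filter_poison_candidates(images, caption_of, alignment_score, threshold=0.25):
    """Flag images whose caption looks fine to a human curator but scores
    poorly under an alignment model (e.g. a CLIP-style image/text scorer).

    alignment_score(image, caption) -> similarity in [0, 1]; a poisoned
    image's features no longer match its caption, so its score drops.
    """
    flagged = []
    for img in images:
        if alignment_score(img, caption_of(img)) < threshold:
            flagged.append(img)  # candidate for repair or re-sourcing
    return flagged

# Demo with a stand-in scorer: pretend images 2 and 5 were poisoned
poisoned = {2, 5}
score = lambda img, cap: 0.1 if img in poisoned else 0.9
print(filter_poison_candidates(range(6), caption_of=str, alignment_score=score))
# [2, 5]
```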

    I also ranted about this on Mastodon.

    • froztbyte@awful.systems · 10 months ago

      I wasn’t aware of the licensing terms - that immediately drops my opinion of it too. I was thinking about looking into it (soon.gif, due spoons) to see if I could make a run-your-own-nightshade deployable for people, but I guess that’s off the table now

      as to effectiveness: as I’ve said elsewhere, the genie is out of the bottle. unless this shit gets (toothfully) regulated out of existence (and that might be impossible for a variety of reasons), I fear that it’s going to become a similar arms race as spam, and there will continue to be a tug-of-war for a while.

      gutfeel: it strikes me that the (current) biggest hope is that the models themselves not only fail to deliver but don’t really have a path to getting there either, so it’s mostly a case of how long VCs can fund the hype. from previous cycles it looks like that spans 2~5y windows. once the hype funding runs out, this stuff will lose significant traction, even though it won’t disappear entirely just yet.

      damn large amount of damage that’ll happen in the meantime, though

    • Amoeba_Girl@awful.systems · 10 months ago

      If you want to hurt the capitalists, consider exfiltrating weights directly, as was done with LLaMa, to ruin their moats.

      Could you tell us more about what you’re referring to here? Thanks!

    • locallynonlinear@awful.systemsOP · 10 months ago

      Ha! Nope, not buying it.

      nasty license … Ironic, considering that their work directly builds upon Stable Diffusion.

      Funny you mention licenses, since Stable Diffusion and the leading AI models were built on labor exploitation. When this issue is finally settled by law, history will not look back well on you.

      So I’m not allowed to have the discussion I’m currently having

      Doesn’t seem to prevent you from doing it anyway. Does any license slow you down? Nope.

      nor to include it in any Linux distro

      Not sure that’s true, but it’s also unnecessary. Artists don’t care about this or need it to be. I think it’s a disingenuous argument, made in the astronaut suit you wear on the high horse drawn from work you stole from other people.

      This is not only an admission of failure but a roadmap for anybody who wants to work around Nightshade.

      Sounds like an admission of success given that you have to step out of the shadows to tell artists on mastodon not to use it because, ahem, license issues???

      No. Listen. The point is to alter the economics: to make training on images from the internet actively dangerous. It doesn’t even take much. A small amount of actively poisoned internet data forces future models to use alignment to bypass it, increasing the (thin) marginal costs of training and of cheating people out of their work.

      Shame on you dude.

      If you want to hurt the capitalists, consider exfiltrating weights directly, as was done with LLaMa, to ruin their moats.

      Good luck competing in the arms race to use other people’s stuff.

      @[email protected] can we ban the grifter?

      • bitofhope@awful.systems · 10 months ago

        I am buying it. I don’t think @[email protected] is pro AI art, just that countermeasures like Glaze and Nightshade are not great either, and I agree.

        Artists don’t care about this or need it to be.

        I care about it. Some artists use Debian. Please don’t shit on people who care about software freedom, even if you don’t.

        Making artwork unusable by exploitative machine learning models is cool and based, but using a proprietary tool that’s itself made from the same pool of exploited artists’ work is less so.

        • froztbyte@awful.systems · 10 months ago

          yeah, seconding. nothing I’ve seen of @corbin’s posting, here or otherwise, leads me to think that they’re in favour of exploitation or the numerous other issues involved in this shit

      • self@awful.systemsM · 10 months ago

        @[email protected] can we ban the grifter?

        corbin’s track record both on and off awful.systems indicates they aren’t any kind of grifter

        in fact, myself and @[email protected] have previously had a conversation about this type of technology (Nightshade and Glaze) where I was initially quite excited about it, but David and others brought up a lot of the same points corbin did here. there were some very solid social points made around the tech too, beyond the licensing and technical points we’ve seen here — should we really be establishing the expectation that artists need to defend their work using this specific proprietary technology? that feels way too close to the bullshit the NFT grifters pulled, where artists could opt out of their work being stolen and sold as an NFT only by following a specific set of steps for each and every NFT market, which doesn’t work.

        this kind of tech also opens the door for rent-seeking; techniques like Glaze and Nightshade can be broken by changes to generative models, which would keep artists on a treadmill continually paying for the latest versions of these proprietary tools, or else. it feels rather like a protection racket run by whoever has the most access to the models — and that’ll always be the same assholes who run the generative AI.

        so I ended up with the strong impression that this technology won’t make things better for artists and other folks who are being exploited by the AI industrial complex; that this might not be something with a purely technical solution. and I think I understand your strong reaction to some of the posts here, because that fucking sucks. there isn’t a clean engineering solution to this problem that my increasingly technofascist industry created, and when you grow up being told (by some of the same techfash fucks who’re now behind some of the worst use of technology I can think of) that all you do is engineering, it’s easy to feel helpless.

        but we aren’t helpless. technofascism is structured to produce and exploit that feeling when it’s engaged with on a purely technical level, but the systems established by technofascism (LLMs, generative AI, cryptocurrencies, and others) are plainly ridiculous when viewed through any other lens. the technofascist goal isn’t to win on technical merit (there isn’t any), but to normalize ridiculousness. the only way I know to push back against that is social. on a small scale, that’s part of what sneering is — any asshole pushing this ridiculous shit should feel ridiculous doing it, as a lot of the crypto grifters felt when the public at large started sneering at crypto (thanks to the efforts of David, Amy, Molly, and many others). on a larger scale, we desperately need systemic change. as engineers, we’re constantly told we don’t need unions or solidarity, particularly with folks like artists who we’re told are unimportant. it is very intentional that attitudes like that enable technofascism.

        if and when we have those social factors established, a version of these tools with less potential for exploitation might be worth considering. but I see it kind of like the relationship between fediverse software and its community — federation is generally a good thing, but it’s absolutely nothing (and would probably be a net negative) without posters who generally want the fediverse to be a cozy place to make good posts; the polar opposite of the utterly hostile commercialized thing the internet at large has become.

      • corbin@awful.systems · 10 months ago

        I don’t care whether you agree with me. I do think it’s quite interesting that my critique of the topic of your post has you replying with personal insults; I think that you’ve incorrectly assumed that I am one of the capitalists who build these models, rather than an anti-capitalist who encourages destroying corporations.

        When this issue is finally settled by law, history will not look back well on you.

        Who are you, Lars Ulrich? Copyright is not compatible with information theory, and so far, information theory has won every contest in the court of public opinion.

        The point is to alter the economics …

        Help defray or establish artists’ funds. Help uncover and prosecute wage theft. Advocate for basic income. If your goal is to remunerate artists, then focus on the efforts which actually help them; don’t support copyright, as it is neither designed nor implemented to help individual artists.

        Shame on you …

        Which one of us is defending a paper which explicitly offers itself as a way for Disney to protect art which it appropriated from artists?

      • 200fifty@awful.systems · 10 months ago

        I don’t think they were defending ai necessarily, just saying they had objections to the specific technique used by these tools. I do think that not open-sourcing the thing is probably defensible given that it exists in an adversarial context, but the technical concerns are worth being aware of

      • Evinceo@awful.systems · 10 months ago

        Does any license slow you down

        I certainly comply with software licenses, so yes, that does slow me down. As they pointed out, this precludes it from appearing in Linux distros and such.

        Incidentally, I’ve gotten into very long and stupid arguments with people about Stable Diffusion’s Definitely Not Open Fucking Source license.

      • SinAdjetivos@beehaw.org · 10 months ago

        I’ll bite…

        1. Why do you argue it is okay to use this “poison pill”, which was created using the same “stolen data” you are railing against, and not other, more open, tools?

        2. You underestimate how much labor already goes into cleaning/filtering datasets; this does not change the economics nearly as much as you want it to. Do you not think there are any filters in place?

        3. If you want to get into it, everything in the modern world is built on labor exploitation. When this issue is finally settled by the law™, it will be in favor of increased capital consolidation, built upon the arguments and misinformation you are repeating. If the law™ expands copyright protections, who do you think that benefits the most?

        4. Your tone, lack of research into the topic, calling others grifters, and attempts to silence any contradictory viewpoints aren’t great… Why are you having such a visceral response to this topic, and where do your talking points come from?

  • Amoeba_Girl@awful.systems · 10 months ago

    The radical thing I chose to do, as an artist, is to make low quality human art that is of no value to the capitalist kitsch industry.

  • swlabr@awful.systems · 10 months ago

    Amazing!

    Edit: pursuant to @[email protected]’s commentary: I didn’t read their literature particularly critically. It would be great if artists had a way to deter AI thievery, and this product looked like it might be that. But I guess it’s not. ¯\_(ツ)_/¯

    • David Gerard@awful.systems · 10 months ago

      it’s fun seeing an attempt to poison the AI datasets.

      we bought some notebooks for the kid for xmas and were horrified that the designs that looked nice in the thumbnail on Amazon appeared to actually be AI churned glurge (wife and kid are both artists with Opinions on this shit)

      • swlabr@awful.systems · 10 months ago

        Oh, that sucks. That brings to mind an episode of the “mystery show” podcast where someone had a lunchbox with art based on the old sitcom “Welcome back, Kotter” that depicted some scene that seemed incongruous. The episode then went into a deep dive about the show, the show’s creator, the lunchbox company, associates of the now/then-deceased artist that designed the lunchbox, and many others, all to try and justify the existence of that seemingly incorrect design.

        While I doubt that any significant number of people would ever investigate the design of merchandising like the above, it is nice to think about the stories behind objects. In the quest to make things cheaper to create, we lose stories like this, and the world gets a bit darker.

  • I_Has_A_Hat@lemmy.world · 10 months ago

    AI is freeing art the same way technology freed music: it’s putting creative tools in the hands of more and more people who don’t need years or decades of practice to become proficient.

    50 years ago, you’d need several talented musicians, an array of expensive instruments and recording equipment to make a song. Now it takes one person and a computer. “Real” musicians turned their nose up at digital music when it first came out, but that’s because they were afraid of the truth: while this new technology might not be as good as the top .01% of musicians, most musicians don’t fall into that category and the rest of the world sees the new way as faster, easier, and frankly better.

    AI art is the same way. While it may not be better than the absolute best artists in the world, it’s better than 99.9% of them. The common person can now create detailed depictions of art in seconds with a few prompts and keywords. There’s no going back from this, and public opinion is not going to reject it. It’s too convenient, too easy, too beneficial of a tool.

    Art is dead, long live Art.

    • gerikson@awful.systems · 10 months ago

      I mean if you equate “art == content”, then sure, all those billions in VC money and poorly paid data entry workers have now enabled a machine that for a monthly fee will provide you with a hero image on your Medium blog. The market for dreck like that is limited, though.

    • Evinceo@awful.systems · 10 months ago

      “Real” musicians turned their nose up at digital music when it first came out

      You got any evidence for that? What do you even mean by “digital music”? That’s not a category I think musicians recognize. You may be confusing digital with electric, which was not exactly as you’re describing it.

    • bitofhope@awful.systems · 10 months ago

      Ah yes, “real” musicians turned their nose up at digital music.

      Now let me tell you about how making music on a computer doesn’t involve years of practice to become proficient. Using a digital audio workstation is basically the same as typing “club banger in the style of Avicii, billboard hot 100, high quality, big boobs, crisp production, lots of bass, danceable” in a prompt box.

      • Amoeba_Girl@awful.systems · 10 months ago

        also sorry i need to get this out

        50 years ago, you’d need several talented musicians, an array of expensive instruments and recording equipment to make a song.

        no. you can make up a song and you can sing it and it doesn’t cost anything. i’ve done it since i was a child and humanity has done it since before history began. fuck you.

    • self@awful.systemsM · 10 months ago

      Art is dead, long live Art.

      @[email protected] already pointed this out, but it’s kind of fucking amazing how obvious it is that this trash sounded smart to you when you wrote it, but in the same post you’ve made a bunch of mistakes that give away you don’t even have the vaguest idea of what art is or how it’s made

      • Amoeba_Girl@awful.systems · 10 months ago

        the year is 2034. i am robbing a bank to pay the openAI licensing fee so i can pursue my dream of being a writer.

      • froztbyte@awful.systems · 10 months ago

        “Democratizing art!!!” really had me chuckling for a second, but only a second. The bit was already used up by the coiners

      • self@awful.systemsM · 10 months ago

        fucking hell I looked at their post history. it’s only a 5 day old account but they’ve been busy posting bad AI takes almost exclusively, including comparing the dudesly George Carlin thing with a drag show