Setting aside the usual arguments in the anti- and pro-AI art debate and the nature of creativity itself, perhaps the negative reaction the Redditor encountered is part of a sea change in opinion among people who see corporate AI platforms as exploitative and extractive, because their datasets rely on copyrighted material used without the original artists’ permission. And that’s without getting into AI’s drag on the environment.

  • deepblueseas@sh.itjust.works · 9 months ago

    Yes, using existing works as reference is obviously something that real human artists do all the time; there’s no arguing that. That’s how people learn to create art to begin with.

    But the fact is, generative AI is not creative, nor does it understand what creativity is, nor will it ever, because all it is doing is running complex statistical algorithms over data to generate a matrix of pixels or a string of words.

    I’m sorry, but the person entering the prompt to instruct the algorithm is not doing anything creative either. Do you think it is art to go through a fast-food drive-through and place an order? That’s what people are objecting to: people calling themselves artists because they put some nonsense word salad together and then think what they get out of it is some unique thing they created and can take ownership of. If not for the AI model they are using and the creative works it was trained on, they could not have created it, and likely could not even have imagined it.

    People are actively losing their livelihoods because AI tech is being oversold and overhyped as something it’s not. Execs are all jumping on the bandwagon, and because they see AI as something that will save them a bunch of money, they are laying off people they think aren’t needed anymore. So try to incorporate that sentiment into your understanding of why people are also upset about AI. You may not be personally affected, but countless others are. In fact, over the next two years, as many as 203,000 entertainment workers in the US alone could be affected.

    Generative AI Impact Study

    If you want to have fun creating fancy kitbashed images based on other people’s work, go right ahead. Just don’t call it art or call yourself an artist unless you could actually make it yourself using practical skills.

    Also, good luck trying to copyright it because guess what, you can’t.

    https://crsreports.congress.gov/product/pdf/LSB/LSB10922

    • Even_Adder@lemmy.dbzer0.com · 9 months ago

      Part 2

      > Also, good luck trying to copyright it because guess what, you can’t.
      >
      > https://crsreports.congress.gov/product/pdf/LSB/LSB10922

      This looks like it’s set to change. The US Copyright Office is proactively exploring and evolving its understanding of this topic and is actively seeking expert and public feedback. You shouldn’t expect this to be its final word on the subject.

      It’s also important to remember that Copyright Office guidance isn’t law. It reflects only the office’s interpretation based on its experience; it isn’t binding on courts or other parties. Guidance from the office is not a substitute for legal advice, and it does not create any rights or obligations for anyone. They are the lowest rung on the ladder for deciding what the law means.

      Let’s keep this civil and productive, even if we disagree. Jeering, dismissive lines like “Also, good luck trying to copyright it because guess what, you can’t” aren’t helping your argument; they’re just mean-spirited. I’m open to keep talking, but I will quit replying if you continue being disrespectful.

      • deepblueseas@sh.itjust.works · 9 months ago

        It’s clear where you hold your stakes in the matter and where I hold mine. Whether or not you want to continue the conversation is up to you, but I’m not going to go out of my way to be polite, because I don’t really give a shit either way, or whether you’re offended by what I say. AI personally affects me and my livelihood, so I have passionate opinions about its use, how companies are adopting it, and how it’s affecting other people like me.

        All the article you linked shows is that they held a meeting, which doesn’t really prove anything. The government has tons of meetings that don’t amount to shit.

        So, instead of arguing whether or not the meeting actually shows they are considering anything different, I will explain my personal views.

        In general, I’m not against AI. It is a tool that can be effective in reducing menial tasks and increasing productivity. But, historically, such technology has done nothing but put people out of work and increase profits for executives and shareholders. At every chance, creatives and their work are devalued and categorized as “easy”, “childish” or not a real form of work by those who do not have the capacity to do it themselves.

        If a company wants to adopt AI technology for creative use, its models should be trained solely on content it owns the copyright to. Most AI models are completely opaque, and their makers refuse to disclose the materials they were trained on. Unless they can show me exactly what images were used to generate the output, I will not trust that the output is unique and not plagiarizing other works.

        Fair use covers very specific cases: parody, satire, documentary and educational use, etc. Ordinary people can be DMCA’ed or targeted in other ways for even small offenses, like remixes, and even sites like archive.org are constantly under threat of lawsuits. In comparison, AI companies are seemingly being given a free pass because of wide adoption, their lack of transparency, and the vagueness about where exactly their output is derived from. A lot of AI companies are adopting opt-out policies to cover their asses, but that is only making our perception of their scraping practices worse.

        As we are starting to see with some journalism lawsuits, plaintiffs are able to point out specifically where their work is being plagiarized, so I hope more artists will speak up and file suit over models that are blatantly trained to mimic their styles. If someone can file a copyright suit against another person for that kind of unauthorized use, they should certainly be able to sue a company for the same thing, especially when the work is being used for profit.