This is just one action in a coming conflict. It will be interesting to see how this shakes out. Does the record industry win, and digital likenesses become outlawed, even taboo? Or do voice, appearance, and the like just become another set of rights that musicians have to negotiate in a record deal?

  • Fubarberry@lemmy.fmhy.ml
    2 years ago

A lot of the AI stuff is a Pandora’s box situation. The box is already open; there’s no closing it again. AI art, AI music, and AI movies will become increasingly high-quality and widespread.

The biggest thing we still have a chance to influence is whether it’s something individuals have access to, or whether it becomes another field dominated by the same tech giants that already own everything. An example is people being against Stable Diffusion because it’s trained by individuals on internet images, but being fine with a company like Adobe doing the same because Adobe snuck a line into its ToS saying it can train AI on anything uploaded to Creative Cloud.

    • RandoCalrandian@kbin.social
      2 years ago

      whether it’s something that individuals have access to

No, we don’t have a chance to influence that. That’s the box being opened.

      Here’s a leaked google internal memo telling them as such: https://www.semianalysis.com/p/google-we-have-no-moat-and-neither

tl;dr: The open-source community has accomplished more in the month since Meta’s AI weights were released than everything we have, and shows no signs of slowing down. We have no secret sauce and no way to prevent anyone from setting up their own models, and the open-source community already has near-GPT-equivalent models running on old laptops, with models running directly on phones as the next target, making our expensive, centralized AI solutions entirely obsolete.

      Edit:

In addition, these corporations only have AI in the first place because they stole/scraped data from regular people and the open-source community. Individuals should not feel obligated to honor any rule or directive that these technologies be owned and operated only by the big players.

      • greenskye@beehaw.org
        2 years ago

The only advantage corporations could have had came from having the money to throw at extremely high-quality training data. The fact that they cheaped out and just used whatever they could find on the internet (or paid a vendor who just used AI to generate the training data) has definitely contributed to the lack of any differentiating advantage.

    • etrotta@beehaw.org
      2 years ago

Saying that Stable Diffusion was trained by “individuals” is a bit of a stretch: it cost over half a million dollars’ worth of compute to train, and Stability AI is still a company at the end of the day. If that counts as trained by individuals, then so do Midjourney and DALL-E.

      • Fubarberry@lemmy.fmhy.ml
        2 years ago

The original Stable Diffusion wasn’t trained by individuals, but the current progression of the software is clearly community-driven: all sorts of new tech and add-ons, huge volumes of community-trained checkpoints and LoRAs, and of course the interfaces themselves, like AUTOMATIC1111 and vladmandic.

        And it’s something you can run yourself offline with a halfway decent graphics card.