There are uses of AI that are proving to be anything but black and white. While voice actors have protested their performances being fed into AI against their will, we are now seeing an example of this being done with permission, in a truly unique case.

  • Mongostein@lemmy.ca · 1 year ago

    Yes. This time, though, the permission came from the family, not the actor. Should that be allowed?

        • Mongostein@lemmy.ca · 1 year ago

          Is it? Or is it that I have other things to do? 🙄

          I haven’t fully formed an opinion on this topic, but to me it seems wrong to use someone’s likeness without their permission. I understand that the family gave permission, which is legally ok, but is it morally ok?

          I’m not sure. I think it should be something negotiated before their death.

    • BetaDoggo_@lemmy.world · 1 year ago

      Depends on whether a voice is considered a copyrightable asset. If it is, it would have transferred to the family when he died, so they could give permission. If not, CDPR legally wouldn’t be required to get consent anyway. New regulation will probably be written to clarify issues like this.

      • barsoap@lemm.ee · 1 year ago

        Depends on whether a voice is considered a copyrightable asset.

        It isn’t. A voice is not an expression and is hardly tangible; you can copyright a voice about as much as you can copyright a violin or a style of play: you can’t. But since we’re talking about a person and not an object, this is use of someone’s likeness, which falls under personality rights.

        • BB69@lemmy.world · 1 year ago

          Once somebody is dead, their estate becomes their representative. It’s up to the estate to make the call at that point.

    • theneverfox@pawb.social · 1 year ago

      Why not?

      Your likeness is basically IP. If it’s worth anything, you can put it in your estate; if it’s worth a lot, you can set up a trust to manage it; and I’m sure there are some legal shenanigans you can do to make it thorny to use.

      I mean, you’re dead. If your family sucks and you’re worried they’ll use your voice or face for something evil, you could make it public domain to trash the value. If you care about your legacy, well… look upon my works and despair, and all that. You can burn your estate to protect it for a lifetime or two, or set up a trust that funds itself by selling use of the license according to certain standards… Eventually it’ll either warp into something very different from your body of work (for better or worse), or you’ll fade into obscurity before the lawyer money runs out, so it’ll just stop.

      A lot of people say “AI is bad” when what they really mean is “AI is powerful; corporations are bad; I don’t want the evil artificial intelligence made by lawyers to misuse the artificial intelligence made by math and human media”

      And, kind of like AI, corporations are a tool. They suffer an alignment problem way worse than AI does, so trusting them with digital technology like networking has been sometimes disastrous, sometimes quite good, but mostly neutral.

      This use of AI to recreate a dead person’s likeness isn’t good or bad… it’s just neutral. There’s no greater issue here than the media industry getting alternatives to human talent: the people are dead, some legacies might corrode faster, but there’s no legal hack or big moral peril here.

      There are people who lived in the small window where recording and storage were good enough for this tech to work, who died before it was inevitable, but who are still recognizable before it “disrupts” media entirely within a year.

      Within another year, consumer-grade abilities will go from the current “uncannily similar voices from a short sample” to “indistinguishable from the original voice”… We’re very close to the point where the likeness debate becomes moot because hobbyists can deepfake 4K video for shitposts.

      • Mongostein@lemmy.ca · 1 year ago

        Because, as you said, some people don’t get along with their families and it could be used maliciously.

        I suppose that could be solved in a will, though.