• @[email protected]
    1 year ago

    But the thing is, it’s not similar to turning their work into a play or a TV show. You aren’t replicating their story at all; they put words in a logical order, and you’re using that order to teach the AI what the next word could logically be.

    As for humans taking much more time to properly mimic a style, of course that’s true (assuming they’re untrained). But an AI requires far more memory and data to do the same. A human can replicate a style from just a few examples of that style, given time. An AI needs to scrape basically the entire internet (and label it, which takes quite some time) to be able to do so. They may need different things, but it’s ridiculous to say they’re completely incomparable. Besides, you make it sound like AI is its own entity, rather than something that was created, trained, and used by humans in the first place.

    • just another dev
      1 year ago

      It’s not the same as turning it into a play, but it is doing something with the work beyond its intended purpose, specifically with the intention of producing derivatives of it at an enormous scale.

      Whether a computer needs more or less of it than a human is not a factor, in my opinion. Actually, the fact that more input is required than for a human only makes it worse, since more of the creator’s work has to be used without their permission.

      Again, the reason I think they’re incomparable is that when a human learns to do this, the damage is relatively limited. Even the best writer can only produce so many pages per day. But when a model learns to do it, the ability to apply that skill is effectively unlimited. The scale of the infraction is so much more extreme that I don’t think it’s reasonable to compare them.

      Lastly, if I made it sound like that, I apologise; that was not my intention. I don’t think it’s the model’s fault, but the fault of the people who decided (directly, or indirectly by not vetting their input data) to take somebody’s copyrighted work and train an LLM on it.

      • @[email protected]
        1 year ago

        I don’t think the potential difference in how much damage can be caused is a reasonable argument. After all, the economic damage to writers from others copying or plagiarizing their work, style, or world is limited not because it’s hard for humans to do so, but because we made it illegal to produce something too similar to another person’s copyrighted work.

        For example, Harry Potter has absolutely been copied to the extent legally allowed, but no one cares about any of those books because they’re not so similar that they affect the sales of Harry Potter at all. And that’s also true for AI. It doesn’t matter how closely it can replicate someone’s style or story if that replication can never be used or sold due to copyright infringement, which is already the case right now. Sure, you can use it to generate thousands of books that are just different enough to not get struck down, but that wouldn’t affect the original book at all.

        Now, to be fair, with art you can get closer to others’ work, because of how art works. But also, to be fair, the art market was never about how good an artist was; it was about how expensive the rich people who bought your art wanted it to be for tax purposes. And I doubt AI art is valuable for that.