• BoscoBear@lemmy.sdf.org · 9 months ago

    Sure, if that is what the network has been trained to do, just like a librarian will if that is how they have been trained.

      • BoscoBear@lemmy.sdf.org · 9 months ago

        Interesting article. It seems to be about a bug, not a designed behavior. It also says the bug exposes random excerpts from books and other training data.

          • BoscoBear@lemmy.sdf.org · 9 months ago

            That is a little like saying every photograph is a copy of the thing it depicts. That is just factually incorrect. I have many three-layer networks that are not the thing they were trained on. As a compression method they can be very lossy, and in fact that is often the point.
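            A minimal sketch of that lossy-compression point, assuming a toy three-layer autoencoder with made-up sizes and data (nothing below comes from the article or this thread): the network squeezes 8-dimensional samples through a 2-unit bottleneck, so the weights it ends up storing are a lossy summary of the inputs rather than a copy of them.

            ```python
            # Toy three-layer autoencoder (input -> 2-unit bottleneck -> output).
            # All sizes and data are made up for illustration.
            import numpy as np

            rng = np.random.default_rng(0)
            X = rng.normal(size=(5, 8))              # 5 "training" samples, 8 features each

            W1 = rng.normal(scale=0.1, size=(8, 2))  # encoder weights
            W2 = rng.normal(scale=0.1, size=(2, 8))  # decoder weights
            lr = 0.05

            for _ in range(5000):
                H = np.tanh(X @ W1)                  # compress to 2 dimensions
                X_hat = H @ W2                       # attempt to reconstruct
                err = X_hat - X
                W2 -= lr * H.T @ err                 # gradient of squared error w.r.t. W2
                W1 -= lr * X.T @ ((err @ W2.T) * (1 - H ** 2))  # backprop through tanh

            # The 2-unit bottleneck cannot hold 5 independent 8-dimensional samples,
            # so the reconstruction error stays well above zero: a lossy summary,
            # not a stored copy of the training data.
            print("mean squared reconstruction error:", np.mean((np.tanh(X @ W1) @ W2 - X) ** 2))
            ```

            Running it, the reported error settles well above zero, which is the sense in which the trained weights are not the training data.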