cross-posted from: https://kbin.social/m/[email protected]/t/525635

A nightmare scenario previously only imagined by AI researchers, where AI image generators accidentally spit out non-consensual pornography of real people, is now reality.

  • Funkwonker@lemmy.world
    1 year ago

    Oh no, if it isn’t the consequences of their actions.

    Really shouldn’t have used training data that was obtained without consent.