• @[email protected]
    8 points • 9 months ago

    They did literally nothing and seem to be using the default Stable Diffusion model, which is only meant to be a tech demo. It would have been easy to set “(((nude, nudity, naked, sexual, violence, gore)))” as the negative prompt.

    • @[email protected]
      7 points • 9 months ago

      The problem is that while negative prompts can help, when the training data is so heavily poisoned in one direction, stuff still gets through.
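
The triple parentheses in the first comment use the attention-emphasis syntax popularized by tools like AUTOMATIC1111's Stable Diffusion WebUI, where each enclosing pair of parentheses multiplies a term's attention weight by a factor (1.1 by default). A minimal sketch of that weighting rule, assuming the default 1.1 factor (the helper name is hypothetical):

```python
# Sketch of AUTOMATIC1111-style emphasis weighting (assumed default factor 1.1):
# each pair of parentheses wrapping a term multiplies its attention weight by 1.1.
def emphasis_weight(term: str, factor: float = 1.1) -> float:
    depth = 0
    # Count how many parenthesis pairs enclose the term.
    while term.startswith("(") and term.endswith(")"):
        term = term[1:-1]
        depth += 1
    return round(factor ** depth, 3)

print(emphasis_weight("(((nude)))"))  # 1.1^3 = 1.331, stronger emphasis
print(emphasis_weight("nude"))        # 1.0, unweighted
```

So `(((nude, nudity, ...)))` in a negative prompt pushes the model away from those concepts with roughly 1.33x the usual weight, which is why it can help but cannot fully override what the model learned from its training data.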