• Zagorath@aussie.zone · 1 year ago

    If they’re inserting random race words into prompts, presumably there’s some kind of preprocessing of the prompt going on. That preprocessor is what would need to know whether the character is specific enough that the race words shouldn’t be applied.

    • Big P@feddit.uk · 1 year ago

      Yeah, but `replace("guy", "ethnically ambiguous guy")` is a very different problem from “does this sentence reference any possible specific character”.
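
      A minimal sketch of that distinction (the `naive_preprocess` name is hypothetical; nothing here is OpenAI’s actual pipeline):

      ```python
      def naive_preprocess(prompt: str) -> str:
          # Blind string substitution: no awareness of context
          return prompt.replace("guy", "ethnically ambiguous guy")

      print(naive_preprocess("homer simpson talking to a guy"))
      # -> "homer simpson talking to a ethnically ambiguous guy"
      # The replace itself is trivial; knowing that "homer simpson" already
      # pins down a specific character's appearance is the hard part.
      ```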

      • stifle867 · 1 year ago

        I don’t think it’s literally a search-and-replace, but a part of the prompt that is hidden from the user and inserted either before or after the user’s prompt. Something like [all humans, unless stated otherwise, should be ethnically ambiguous]. Then, when generating, it got confused and took that to mean he should be *named* “ethnically ambiguous”.
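
        That guess could be sketched like this (hypothetical constant and function names; the bracketed instruction is just the example wording from this comment, not a known system prompt):

        ```python
        # Hypothetical hidden-instruction preprocessor: the user's text is
        # left untouched and the instruction is appended after it.
        HIDDEN_INSTRUCTION = "[all humans, unless stated otherwise, should be ethnically ambiguous]"

        def build_prompt(user_prompt: str) -> str:
            return f"{user_prompt} {HIDDEN_INSTRUCTION}"

        print(build_prompt("a guy holding a sign"))
        # -> "a guy holding a sign [all humans, unless stated otherwise, should be ethnically ambiguous]"
        ```

        A model that treats the whole string as one description could then plausibly read the appended phrase as part of the scene, which would explain the sign saying “ethnically ambiguous”.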

        • intensely_human@lemm.ee · 1 year ago

          It’s not hidden from the user. You can see the prompt used to generate the image, to the right of the image.

    • intensely_human@lemm.ee · 1 year ago

      Gee, I wonder if there’s any way to use GPT-4 to detect whether a prompt includes reference to any specific characters. 🤔