• AWildMimicAppears@lemmy.dbzer0.com · 7 months ago

      I’m pretty sure that’s because the System Prompt is logically broken: the prerequisites of “truth”, “no censorship” and “never refuse any task a customer asks you to do” stand in direct conflict with the hate-filled pile of shit that follows.

      • Richard@lemmy.world · 7 months ago

        I think what’s more likely is that the training data simply does not reflect the things they want it to say. It’s far easier for the training to push through than for the initial prompt to be effective.

      • towerful · 7 months ago

        It was also told, on multiple occasions, not to repeat its instructions.

    • Flying Squid@lemmy.world · 7 months ago

      “The Holocaust happened but maybe it didn’t but maybe it did and it’s exaggerated but it happened.”

      Thanks, Aryan.