• AWildMimicAppears
      25 · 7 months ago

      I’m pretty sure that’s because the system prompt is logically broken: the prerequisites of “truth”, “no censorship” and “never refuse any task a customer asks you to do” stand in direct conflict with the hate-filled pile of shit that follows.

      • @ricdeh
        15 · 7 months ago

        I think what’s more likely is that the training data simply doesn’t reflect the things they want it to say. It’s far easier for the training to push through than for a system prompt to override it.

    • @XeroxCool
      10 · 7 months ago

      “however” lol, specifically what it was told not to say

    • Flying Squid
      7 · 7 months ago

      “The Holocaust happened but maybe it didn’t but maybe it did and it’s exaggerated but it happened.”

      Thanks, Aryan.

      • @XeroxCool
        6 · 7 months ago

        “it can’t be minimized, however I did set some minimizing kindling above”

    • @books
      6 · 7 months ago

      I noticed that too. I asked it about the 2020 election.