• @dexa_scantron
    link
    221 month ago

    It tends to break chat bots because those are mostly pre-written prompts sent to ChatGPT along with the query, so this wipes out the pre-written prompt. It’s anarchic because this prompt can get the chat bot to do things contrary to the goals of whoever set it up.

    • @[email protected]
      cake
      link
      fedilink
      191 month ago

      It’s also anarchist because it is telling people to stop doing the things they’ve been instructed to do.

      • @SkyezOpen
        link
        161 month ago

        Fuck you I won’t do what you tell me.

        Wait no-

    • @[email protected]
      link
      fedilink
      41 month ago

      It’s not completely effective, but one thing to know about these kinds of models is they have an incredibly hard time IGNORING parts of a prompt. Telling it explicitly to not do something is generally not the best idea.

    • Smorty [she/her]
      link
      fedilink
      2
      edit-2
      1 month ago

      Yeah, that’s what I referred to. I’m aware of DAN and it’s friends, personally I like to use Command R+ for its openness tho. I’m just wondering if that’s the funi in this post.