• @Grimy
    2 months ago

    They already got rid of the loophole a long time ago. It's a good thing tbh, since half the people using local models are doing it because OpenAI won't let them do dirty roleplay. It's strengthening their competition and showing why these closed models are such a bad idea. I'm all for it.

    • @felixwhynot
      2 months ago

      Did they really? Do you mean that specific phrase, or are you saying it's not currently possible to jailbreak ChatGPT at all?

      • @Grimy
        2 months ago

        They usually take care of a jailbreak within the week it's made public. This one is more than a year old at this point.