• @[email protected]
    5 months ago

    For me, it always just responds:

    »I’m sorry, but I can’t comply with that request.«

    right after I send a jailbreak. Also with 4o.

    • @pavnilschandaOPM
      5 months ago

      Considering that the users are Chinese, perhaps it’s easier to jailbreak ChatGPT in a non-English language. There are English-speaking users still using jailbreaks today, but they either use a milder version of the once-popular jailbreak or secretly share the updated prompts through DMs.

      ETA: My bad, the user was conversing in English, but the latter explanation still applies.