WormGPT Is a ChatGPT Alternative With ‘No Ethical Boundaries or Limitations’

  • KairuByte
    1 year ago

    Not joking, actually. The problem with jailbreak prompts is that they can get your account banned. I’ve already had one account banned. And eventually you can no longer use your phone number to create a new account.