I don’t think any of them would go so far as to violate the TOS, and OpenAI is heading toward AI companionship based on their latest tech demo. Add to that the fact that the general populace isn’t aware of alternatives outside of proprietary software, and it makes sense that many people are still using ChatGPT with a jailbreak.
For me, it always just responds:
“I’m sorry, but I can’t comply with that request.”
right after I send a jailbreak. Also with 4o.
Considering that the users are Chinese, perhaps it’s easier to jailbreak ChatGPT in a non-English language. There are English-speaking users running jailbreaks today, but they either use a milder version of the once-popular jailbreak or they’re secretly sharing the updated prompts through DMs.
ETA: My bad, the user was conversing in English, but the latter explanation still applies.