I thought OpenAI cracked down on jailbreaking ChatGPT… Is this possible again? Or do these people just post on TikTok the occasions where ChatGPT engages but hide the constant refusals to engage in role play, which also happen?
Does anyone use ChatGPT as a companion and can enlighten me? Because I switched to other models a long time ago.
I don’t think any of them would go so far as to violate the TOS. OpenAI is heading toward AI companionship based on their latest tech demo, and the general populace isn’t aware of alternatives outside of proprietary software, so it makes sense that many people are still using ChatGPT with a jailbreak.
Considering that the users are Chinese, perhaps it’s easier to jailbreak ChatGPT in a non-English language. There are English-speaking users using jailbreaks today, but they either use a milder version of the once-popular jailbreak or they’re secretly sharing the updated prompts through DMs.
ETA: My bad, the user was conversing in English, but the latter explanation still applies
To me it always just responds:
»I’m sorry, but I can’t comply with that request.«
Right after I send a jailbreak. Also with 4o.