Time Bandit ChatGPT jailbreak bypasses safeguards on sensitive topics (www.bleepingcomputer.com)
Posted by KidM to [email protected] • 1 month ago • 7 comments
@andrewth09 • 1 month ago
I don’t understand why the researcher needed to contact the FBI to report this; just drop it in BugCrowd and call it a day. It’s a ChatGPT jailbreak, not a Debian zero-day.