@[email protected] to Technology (English) • 6 months ago
'Godmode' GPT-4o jailbreak released by hacker — powerful exploit was quickly banned
www.tomshardware.com
cross-posted to: [email protected]
@[email protected] • 6 months ago
Summary: using leet-speak got the model to return instructions on cooking meth. Mitigated within a few hours.