shish_mish to Technology (English) • 11 months ago
Researchers jailbreak AI chatbots with ASCII art -- ArtPrompt bypasses safety measures to unlock malicious queries
www.tomshardware.com
cross-posted to: [email protected]
@oDDmON (English) • 11 months ago
…researchers from NTU were working on Masterkey, an automated method of using the power of one LLM to jailbreak another.

Or: welcome to where AI becomes an arms race.
This is how Skynet starts.