shish_mish to Technology (English) • 1 year ago
Researchers jailbreak AI chatbots with ASCII art -- ArtPrompt bypasses safety measures to unlock malicious queries
www.tomshardware.com
cross-posted to: [email protected]
you can also possibly sub in 🔫 if "water guns" are a no-no