@[email protected] to [email protected] · English · 9 months ago
Researchers jailbreak AI chatbots with ASCII art -- ArtPrompt bypasses safety measures to unlock malicious queries
www.tomshardware.com
cross-posted to: technology