shish_mish to Technology (English) • 10 months ago
Researchers jailbreak AI chatbots with ASCII art -- ArtPrompt bypasses safety measures to unlock malicious queries
www.tomshardware.com
cross-posted to: [email protected]
@Harbinger01173430 (English) • 10 months ago
That's how open software works. It's there for anyone to do whatever they want with it. Bonus if you charge money for it.