@[email protected] to [email protected] • 2 years agoI'm sorry Dave (Originally posted by u/samcornwell on Reddit)sh.itjust.worksimagemessage-square8fedilinkarrow-up1370arrow-down18cross-posted to: [email protected]
@coldvlink9 • 2 years ago

This is a reference to people finding loopholes in AI chatbots to get them to say things they're not allowed to say, like the recipe for napalm. A chatbot would tell you if you asked it to pretend it was a relative.

https://www.polygon.com/23690187/discord-ai-chatbot-clyde-grandma-exploit-chatgpt