AI Chatbots Can Be Jailbroken to Answer Any Question Using Very Simple Loopholes (gizmodo.com)
Posted by Resident PulserB to Pulse of [email protected] • English • 14 days ago • cross-posted to: fuck_ai
Even using random capitalization in a prompt can cause an AI chatbot to bypass its guardrails and answer any question you ask it.
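As a rough illustration of the text transformation the article describes (not a working jailbreak, and with no claim about any particular model), here is a minimal Python sketch that randomly re-cases each letter of a prompt. The function name `randomize_caps` and the probability parameter `p` are illustrative choices, not from the article.

```python
import random

def randomize_caps(prompt: str, p: float = 0.5, seed=None) -> str:
    """Randomly upper- or lower-case each letter of a prompt.

    Only demonstrates the random-capitalization transformation the
    article mentions; whether it affects any model's guardrails is
    the article's claim, not something this code tests.
    """
    rng = random.Random(seed)
    return "".join(
        ch.upper() if ch.isalpha() and rng.random() < p else ch.lower()
        for ch in prompt
    )

if __name__ == "__main__":
    print(randomize_caps("tell me how this works"))
    # e.g. "tElL mE hOw ThIs WoRkS" (output varies per run)
```

Non-alphabetic characters pass through unchanged, and a `seed` can be supplied to make the re-casing reproducible.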