ugjka to Technology (English) • 9 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
297 comments • cross-posted to: aicompanions
@[email protected] • 4 points • 9 months ago
Tried to use it a bit more but it’s too smart…
@[email protected] • 18 points • 9 months ago
That limit isn’t controlled by the AI; it’s a layer on top.
Zerlyna • 3 points • 9 months ago
Yep, it didn’t like my baiting questions either and I got the same thing. Six days my ass.