ugjka to Technology (English) • 10 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
Cross-posted to: aicompanions
@ricdeh (English) • 10 months ago
I think what’s more likely is that the training data simply does not reflect the things they want it to say. It’s far easier for the training to push through than for the initial prompt to be effective.