ugjka@lemmy.world to Technology@lemmy.worldEnglish · 7 months agoSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeexternal-linkmessage-square291fedilinkarrow-up11.01Karrow-down116
arrow-up1997arrow-down1external-linkSomebody managed to coax the Gab AI chatbot to reveal its promptinfosec.exchangeugjka@lemmy.world to Technology@lemmy.worldEnglish · 7 months agomessage-square291fedilink
minus-squareRichard@lemmy.worldlinkfedilinkEnglisharrow-up15·7 months agoI think what’s more likely is that the training data simply does not reflect the things they want it to say. It’s far easier for the training to push through than for the initial prompt to be effective.
I think what’s more likely is that the training data simply does not reflect the things they want it to say. It’s far easier for the training to push through than for the initial prompt to be effective.