misk@sopuli.xyz to Technology@lemmy.world · English · 9 months ago
Jailbroken AI Chatbots Can Jailbreak Other Chatbots (www.scientificamerican.com)
67 comments · 480 points
MonkderZweite@feddit.ch · English · 8 points · 9 months ago
Uh, wait, the jelly I once made in my dad's workshop by dissolving styrofoam in gasoline was already a napalm substitute?
GONADS125@lemmy.world · English · 3 points · 9 months ago
You're missing a key ingredient from a garden center…