ugjka@lemmy.world to Technology@lemmy.world · English · 8 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
cross-posted to: [email protected]
𝙲𝚑𝚊𝚒𝚛𝚖𝚊𝚗 𝙼𝚎𝚘𝚠 · 8 months ago
Tay was actively being manipulated by malicious users.
AbidanYre@lemmy.world · edited 7 months ago
That’s fair. I just think it’s funny that the well-intentioned one turned into a Nazi and the Nazi one needs to be pretty heavy-handedly told not to turn into a decent “person”.