return2ozma@lemmy.world to Technology@lemmy.world · English · 6 months ago
ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
33 comments · cross-posted to: [email protected]
Echo Dot@feddit.uk · English · 6 months ago
Or literally just buy some fertiliser. We've all seen what happens when ammonium nitrate catches fire: if you have enough of it in one place, it's practically a nuclear-bomb-level detonation.
MeThisGuy@feddit.nl · English · 6 months ago
Like this guy?
https://wikipedia.org/wiki/Oklahoma_City_bombing