ylai@lemmy.ml to AI@lemmy.ml · English · 11 months ago
AI chatbots tend to choose violence and nuclear strikes in wargames (www.newscientist.com)
FaceDeer@kbin.social · 11 months ago
I wouldn’t be surprised if this actually factors into this outcome. AI is trying to do what humans expect it to do, and our fiction is full of AIs that turn violent.
averyminya@beehaw.org · 11 months ago
Not to mention humans’ own tendencies toward violence.