Lee Duna@lemmy.nz to Fuck AI@lemmy.world · English · 2 months ago
ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
betterdeadthanreddit@lemmy.world · 7 points · 2 months ago
Yeah, if someone has to ask a slop machine for instructions on making WMDs, I don't think they're much of a threat. Aum Shinrikyo and the Rajneeshees did their stuff without LLM assistance.