return2ozma@lemmy.world to Technology@lemmy.world · English · 4 months ago

ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)

33 comments
CodenameDarlen@lemmy.world · English · edited · 4 months ago

deleted by creator