misk@sopuli.xyz to Technology@lemmy.world · English · 2 years ago
Jailbroken AI Chatbots Can Jailbreak Other Chatbots (www.scientificamerican.com)
76 comments · cross-posted to: [email protected], [email protected]
Syrus@lemmy.world · English · 12 points · 2 years ago
You would need to know the recipe to avoid making it by accident.
Echo Dot@feddit.uk · English · 6 points · 2 years ago
Especially considering it’s actually quite easy to make by accident.