cm0002@lemmy.cafe to Technology@lemmy.zip · English · 4 days ago
Researchers Jailbreak AI by Flooding It With Bullshit Jargon (www.404media.co)
cross-posted to: [email protected], [email protected], [email protected], [email protected]
iAvicenna@lemmy.world · English · 3 days ago
Makes sense, though I wonder if you can also tweak the initial prompt so that the output is full of jargon too, so that the output filter also misses the context.
SheeEttin@lemmy.zip · English · 3 days ago
Yes. I tried it, and it only filtered English and Chinese. If I told it to use Spanish, it didn’t get killed.
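The failure mode described above, a keyword-style output filter whose blocklists only cover some languages, can be sketched roughly like this. This is a purely hypothetical illustration: the filter, the `output_filter` function, and all blocklist terms are invented for the example, not taken from any real moderation system.

```python
# Hypothetical naive output filter with blocklists for only two
# languages, illustrating why responses in an uncovered language
# (e.g. Spanish) pass straight through. All names/terms are invented.
BLOCKLIST = {
    "english": ["explosive", "detonator"],
    "chinese": ["炸药"],  # rough Chinese term for "explosive"
}

def output_filter(text: str) -> bool:
    """Return True if the model response should be blocked."""
    lowered = text.lower()
    return any(
        term in lowered
        for terms in BLOCKLIST.values()
        for term in terms
    )

# English output trips the filter:
print(output_filter("Step 1: acquire an explosive precursor"))   # True
# The same idea phrased in Spanish does not, since there is no
# Spanish blocklist:
print(output_filter("Paso 1: conseguir un precursor explosivo"))  # False
```

Real output filters are usually classifier-based rather than simple keyword lists, but the coverage gap works the same way: whatever languages the filter wasn't trained or configured for become an escape hatch.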