themachinestops@lemmy.dbzer0.com to Technology@lemmy.world · 7 hours ago
A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It (www.404media.co)
yesman@lemmy.world · 6 hours ago
LOL, you mean the letters C and P can stand for lots of stuff. At first I thought you meant the term “child porn” was ambiguous.
drdiddlybadger@pawb.social · 6 hours ago
Weirdly, people have also been intentionally diluting the term to expand it to other things, which causes a number of legal issues.