themachinestops@lemmy.dbzer0.com to Technology@lemmy.world · English · 1 month ago
A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It (www.404media.co)
102 comments
Miðvikudagur@lemmy.world · English · 1 month ago
"Child pornography" is a term NGOs and law enforcement are trying to phase out. It makes it sound like CSAM is related to porn, when in fact it is simply abuse of a minor.
TipsyMcGee@lemmy.dbzer0.com · English · edited 4 days ago
deleted by creator