themachinestops@lemmy.dbzer0.com to Technology@lemmy.world · English · 1 month ago
A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It (www.404media.co)
Cross-posted to: [email protected], [email protected], [email protected], [email protected], [email protected]
bobzer@lemmy.zip · English · 1 month ago
Why say sexual abuse material images, which is grammatically incorrect, instead of sexual abuse images, which is what you mean, and shorter?