Abovethefold@lemmy.ml to Privacy@lemmy.ml (English) · 7 months ago
Apple pulls AI image apps from the App Store after learning they could generate nude images (ptv-news.com.pk)
cross-posted to: [email protected]
Felix@lemmy.ml · 7 months ago
Likely not, because they aren't regulated by Apple. Don't take my word for it, though.
Coasting0942@reddthat.com · 7 months ago
Depends on their legal status. Could they get sued by a victim?
Felix@lemmy.ml · 7 months ago
There wouldn't be a victim; it's AI.
Coasting0942@reddthat.com · 7 months ago
A minor who gets her face turned into porn wouldn't be able to sue because it's not Photoshop, it's AI. /s