Senior UK police officer says AI is accelerating violence against women and girls and that technology companies are complicit

  • HubertManne@piefed.social · 1 day ago

    I can see this. Someone commented in another thread that if it's for private use it's fine, and I sort of get that, but someone else made the point that the AI shouldn't have that info in the first place. So people should really be concerned that AI can take image, video, or audio input from users. It should be limited to text or non-image documents. If someone wants to deepfake an ex, they'd have to describe them.

    • Ashtear@piefed.social · 1 day ago

      Yes, if it were done on a locally-hosted generative model, I wouldn’t be bothered if someone did this in my likeness. It wouldn’t be meaningfully different from using Photoshop to fake it ten years ago.

      Passing it around to their friends and gods know whom else is still just as reprehensible though.

      • HubertManne@piefed.social · 1 day ago

        Yeah, and Photoshopped images had the passing-around problem too. AI does have the added issue of user input being folded into other outputs, though. If someone asks for a redhead, will it reference the various random redheads users have fed into the system? Honestly, what's scarier is if someone wants to touch up photos of their family and the AI treats those as human images it can use as reference for other porn requests.

        • Ashtear@piefed.social · 1 day ago

          That’s why I said local models. They aren’t automatically taking user inputs and running training updates on them.

          And yeah, we’ve all already had our likenesses folded in somewhere. That’s the bigger problem here.