Senior UK police officer says AI is accelerating violence against women and girls and that technology companies are complicit

  • P03 Locke@lemmy.dbzer0.com · 1 day ago

    The Guardian: We interviewed a police officer, from an organization most of the public doesn’t trust, and he spouted off his opinion. Since he said the magic word “AI”, we jumped all over it.

    I’d like to know how this is actually “accelerating violence against women and girls”. This is on the level of the “video games promote violence and create serial killers” panic of the 80s and 90s.

  • HubertManne@piefed.social · 1 day ago

    I can see this. Someone in another thread commented that if it’s for private use it’s fine, and I sort of get that, but someone else made the point that the AI really shouldn’t have the material at all. So people should be concerned that AI can take image, video, or audio input from users. It should be limited to text or non-image documents. If someone wants to deepfake an ex, they’d have to describe them.

    • Ashtear@piefed.social · 1 day ago

      Yes, if it were on a locally hosted generative model, I wouldn’t be bothered if someone did this with my likeness. That wouldn’t be meaningfully different from using Photoshop to fake it ten years ago.

      Passing it around to their friends and gods know who else is still just as reprehensible, though.

      • HubertManne@piefed.social · 1 day ago

        Yeah, and Photoshopped fakes had the passing-around problem too. AI does have the added issue of inputs being integrated into other outputs, though. If someone asks for a redhead, will it reference the redheads various randos have fed into the system? Honestly, what’s scarier is someone wanting to touch up photos of their family, and the AI taking those as human images that make sense to use as reference for other porn requests.

        • Ashtear@piefed.social · 1 day ago

          That’s why I said local models. They aren’t automatically taking what you feed them and running training updates on it.

          And yeah, we’ve all already had our likenesses folded in somewhere. That’s the bigger problem here.

  • sidebro@lemmy.zip · 1 day ago

    I mean, the majority of deepfakes out there are so bad that I understand why some people don’t care about them. But this stuff keeps getting better with time.

    • Z3k3@lemmy.world · 1 day ago

      The bad stuff will never go away, so people will think they can spot it, while the really good stuff will be used for the dangerous stuff.

      And yes, I do count myself as someone who will get it wrong. I just hope I can stave off that moment for as long as possible.

  • Asidonhopo@lemmy.world · 1 day ago

    I think the argument will eventually end up in the uncanny valley: obviously it’s legal to draw a shitty little cartoon of someone engaged in sex, but does it become illegal as it approaches photorealism? Is Photoshopping someone’s head onto a porn star illegal? I mainly don’t like laws like this because they bring politicians who are ignorant about AI into legislating it on the basis of it being scandalous sex-related stuff, generally to sell more surveillance.