Also, don’t leave your account sitting unused; delete it. User and follower numbers count.

And, not least important, reply to every email with Twitter/X in the footer (if necessary to another corporate mail address) with a kind request to stop promoting and facilitating X.

https://bio.link/everyonehateselon

  • araneae@beehaw.org · edited · 5 hours ago

    A computer program was trained on millions of pictures of people that the program’s parent company acquired by hook or by crook, and now any photo you posted of your kids is fair game to be recycled into the most heinous shit imaginable at the press of a button by the kind of slug still using Twitter. There is abuse happening there: they decided to build a machine that could take family photos put online by well-meaning people and statistically morph them, ship-of-Theseus style, into pornography, with no safeguards.

    If I were a parent and even theoretically one pixel of hair on my child’s head were used as aggregate data for this mathematical new form of abuse by proxy, I’d be Old Testament mad. I would liken it to a kind of theft or even rape that we have no clear word for or legal concept of yet.

    I would suggest just not defending this stuff in any way, because you’re simply going to lose, or, from your perspective, be dogpiled by people having what you perceive to be an extreme moral panic over this issue.

    • Schadrach@lemmy.sdf.org · 3 hours ago

      statistically morph them ship-of-Theseus style

      That’s not really an accurate description of how it works. It doesn’t, like, have a big database of labelled photos that it looks up, grab a few that sound similar to what it’s being asked for, and then sort of collage those together. It’s basically seen a huge number of photos, been told what those photos are photos of, and from them devised a general model of what those things look like, becoming more accurate as it sees more examples. Then, at generation time, it gets handed a block of white noise and asked to show how that white noise could look like whatever it’s prompted to make. Inpainting is a little different in that it starts from an existing image instead of white noise.
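      (A toy Python sketch of that “start from white noise” loop, just to make the control flow concrete. `fake_noise_predictor` and all the numbers are made-up stand-ins, not any real library’s API; in the real thing the predictor is the trained network, and the prompt steers what it predicts:)

      ```python
      import numpy as np

      # Stand-in for the trained network: given a noisy image, guess which
      # part of it is noise. This is where everything "learned" in training
      # (and the prompt) would come in; here it just returns random values.
      def fake_noise_predictor(noisy, rng):
          return rng.standard_normal(noisy.shape) * 0.1

      rng = np.random.default_rng(0)
      image = rng.standard_normal((64, 64, 3))  # start from pure white noise

      for _ in range(50):
          # Each step removes a bit of the predicted noise, gradually
          # "carving" a picture out of static. Inpainting would start from
          # an existing image, noised only in the masked region, instead.
          image -= 0.1 * fake_noise_predictor(image, rng)
      ```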

      The training data isn’t part of the model itself (a big hint here should be the existence of LLM or image generation models that are ~10GB in size but were trained on literal terabytes or more of training data; that kind of compression would be absolutely insane and would be used in everything everywhere if the training data were actually part of the model). Several of them are even openly available and can be pretty easily run locally on consumer hardware.
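      (Back-of-the-envelope version of that size argument; the terabyte figure is the rough one from the comment, not a measurement of any particular model:)

      ```python
      training_data_tb = 2   # "literal terabytes" of training images (assumed)
      model_gb = 10          # rough size of a downloadable model
      ratio = training_data_tb * 1024 / model_gb
      # ~205:1 lossless compression would be implied, on top of the training
      # images already being JPEG-compressed; no real compressor gets close,
      # so the images themselves cannot all be stored inside the model.
      print(f"implied compression ratio: ~{ratio:.0f}:1")
      ```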

      …but yeah, somewhere some model saw a photo of you in training and changed a couple of weights somewhere in the network by some tiny fraction, ever so slightly adjusting its notions of what people look like and what the other things in that image look like. That doesn’t mean any image created by it henceforth is in any meaningful way a picture of you.
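      (Again a toy illustration with made-up sizes: one photo contributes one gradient step that nudges every shared weight by a minuscule amount, and the photo itself is never stored:)

      ```python
      import numpy as np

      rng = np.random.default_rng(1)
      weights = rng.standard_normal(1_000_000)   # stand-in for the model's parameters
      gradient = rng.standard_normal(1_000_000)  # gradient computed from ONE photo
      learning_rate = 1e-4

      weights -= learning_rate * gradient        # the photo's entire "contribution"
      # The average nudge per weight is on the order of 1e-4; the photo is
      # gone, and only these tiny shared adjustments remain.
      print(np.abs(learning_rate * gradient).mean())
      ```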

      I would liken it to a kind of theft or even rape that we have no clear word for or legal concept of yet.

      Like anyone who has ever seen a child or a depiction of a child producing any sexually explicit illustration of any sort ever after, then? Because even human artists do not create ex nihilo; they start from a foundation of their cumulative experiences, which means anyone who has ever laid eyes on your child or a photo of your child is, at some level, incorporating some aspect of that experience, however small, into any artwork they create henceforth, and should they ever draw anything explicit…