Also, don’t just leave your account unused; delete it. User and follower numbers count.

And at least as important: reply (if necessary from another corporate mail address) to every email with Twitter/X in the footer, with a kind request to stop promoting and facilitating X.

https://bio.link/everyonehateselon

    • zalgotext@sh.itjust.works · 2 days ago

      Yeah see that’s cool, everyone’s entitled to their opinion. My opinion is that anything that normalizes the sexualization of children should be shamed and shunned.

        • zalgotext@sh.itjust.works · 2 days ago

          Ok so your position is just based on semantics then, because someone used the term “built in child abuse tool” instead of “built in child pornography generator”? Is that really a leg you wanna stand on?

            • zalgotext@sh.itjust.works · 2 days ago

              So you really were arguing that Twitter providing a built-in child porn generator isn’t a valid reason to leave it?

                • zalgotext@sh.itjust.works · 2 days ago

                  Ok let’s start over. Ignore the context in the previous comments, pretend I’m asking you this for the first time, with no lead up:

                  Do you think it’s reasonable to leave Twitter because they provide a tool that can be used to generate child porn?

    • iglou@programming.dev · 2 days ago

      What the fuck?

      Fictional child pornography is still child pornography. This is not just about abuse.

Twitter having a tool to create child pornography is an excellent reason to quit Twitter.

    • 0x0@lemmy.zip · 2 days ago

> That said, drawing a sexualized image of a fictional child is not child abuse, even if a machine does it. It lacks the whole, you know, child-being-abused part that is kinda central to child abuse.

      I get that, but the underlying detail is that for the “AI” to generate that, it most likely already saw it to begin with. Right-wing AI trained on child porn; weird, I know, but it’s this timeline.
      Plus if users are prompting for that… perhaps as a platform you don’t want those users.

    • AlreadyDefederated@midwest.social · 2 days ago

      What was it trained on, though?

      Even if what it produces is “not technically child abuse”, it was trained on how to make pretend child abuse, emulating what it has already seen. It uses previous child abuse as a guide to make stuff. It may also mix and match the real child abuse to make pretend child abuse. There might be real abused children in those images.

      That’s bad, right?

    • araneae@beehaw.org · 2 days ago (edited)

      A computer program was trained on millions of pictures of people that its parent company acquired by hook or by crook, and now any photo you posted of your kids is fair game to be recycled into the most heinous shit imaginable at the press of a button by the kind of slug still using Twitter. There is abuse happening here: they decided to build a machine that could take family photos put online by well-meaning people and statistically morph them, ship-of-Theseus style, into pornography with no safeguards.

      If I were a parent and even theoretically one pixel of hair on my child’s head were used as aggregate data for this mathematic new form of abuse by proxy, I’d be Old Testament mad. I would liken it to a kind of theft, or even rape, that we have no clear word for or legal concept of yet.

      I would suggest just not defending this stuff in any way because you’re simply going to lose, or from your perspective, be dogpiled by people having what you perceive to be an extreme moral panic over this issue.

        • araneae@beehaw.org · 1 day ago

          I understand, albeit in layman’s terms, more or less what LLMs and image generators are doing, and used the ship of Theseus as shorthand referring to the processes by which real photos are laundered into data sets.

          I am aware of the literal difference between an individual model and the data it trains on, and understand that Grok and its like are divorced from their output. I have even played with running a local model. This level of concern would be unwarranted if humans were decent and only trained Grok on, and asked it to generate, puppies playing in open fields.

          > That doesn’t mean any image created by it henceforth is in any meaningful way a picture of you.

          Of course not, it is a picture of hundreds of thousands or maybe millions of people who offered up varying degrees of consent* for the use of their bodies to make any kind of porn.

          *Usually 0%, as illegally scraped data is not subject to a hostile TOS agreement when the scraper uploads it as training data without even the knowledge or consent of the original company hosting it

          > Like anyone who has ever seen a child or depiction of a child producing any sexually explicit illustration of any sort everafter, then? Because even human artists do not create ex nihilo, they start from a foundation of their cumulative experiences, which means anyone who has ever laid eyes on your child or a photo of your child is at some level incorporating some aspect of that experience however small into any artwork they create henceforth and should they ever draw anything explicit…

          I think this largely speaks for itself as some of the worst words ever put together in any order. Comparing human creativity, and how we draw inspiration from our forebears, to the present subject is abhorrent. Imagining human minds turning every single speck of human flesh they see into jackoff material, because you assume that’s how the mind works after learning it from a robot, is beyond everything.

          Die on some other hill unless you’re being paid well. This is a neo Nazi’s CSAM and propaganda machine. If they want to fix it, they’ll scrub their training data and figure out the weights and do a massive ban wave on the abusers. It is not on this world to suffer excuses for this hideous fucking bullshit.