- cross-posted to:
- [email protected]
Also, don’t just leave your account unused; delete it. User and follower numbers count.
And at least as important: reply to every email that has Twitter/X in the footer (if necessary, to another corporate mail address) with a kind request to stop promoting and facilitating X.


Removed by mod
Yeah see that’s cool, everyone’s entitled to their opinion. My opinion is that anything that normalizes the sexualization of children should be shamed and shunned.
Removed by mod
Ok so your position is just based on semantics then, because someone used the term “built in child abuse tool” instead of “built in child pornography generator”? Is that really a leg you wanna stand on?
Removed by mod
So you really were arguing that Twitter providing a built-in child porn generator isn’t a valid reason to leave it?
Removed by mod
Ok let’s start over. Ignore the context in the previous comments, pretend I’m asking you this for the first time, with no lead up:
Do you think it’s reasonable to leave Twitter because they provide a tool that can be used to generate child porn?
What the fuck?
Fictional child pornography is still child pornography. This is not just about abuse.
Twitter having a tool to create child pornography is an excellent reason to quit Twitter.
I get that, but the underlying detail is that for the “AI” to generate that, it most likely already saw it to begin with. Right-wing AI trained on child porn: weird, I know, but it’s this timeline.
Plus if users are prompting for that… perhaps as a platform you don’t want those users.
Removed by mod
What was it trained on, though?
Even if what it produces is “not technically child abuse”, it was trained on how to make pretend child abuse, emulating what it has already seen. It uses previous child abuse as a guide to make stuff. It may also mix and match the real child abuse to make pretend child abuse. There might be real abused children in those images.
That’s bad, right?
Okay then, pedo John, thanks for the info
A computer program trained on millions of pictures of people that the program’s parent company acquired by hook or by crook, and now any photo you posted of your kids is fair game to be recycled into the most heinous shit imaginable at the press of a button by the kind of slug still using Twitter. There is abuse happening there: they decided to build a machine that could take family photos put online by well-meaning people and statistically morph them, ship-of-Theseus style, into pornography with no safeguards.
If I were a parent and even theoretically one pixel of hair on my child’s head were used as aggregate data for this mathematical new form of abuse by proxy, I’d be Old Testament mad. I would liken it to a kind of theft or even rape that we have no clear word for or legal concept of yet.
I would suggest just not defending this stuff in any way, because you’re simply going to lose, or, from your perspective, be dogpiled by people having what you perceive to be an extreme moral panic over this issue.
Removed by mod
I understand, albeit in layman’s terms, more or less what LLMs and image generators are doing, and used the ship of Theseus as shorthand referring to the processes by which real photos are laundered into data sets.
I am aware of the literal difference between an individual model and the data it trains on, and understand that Grok and its like are divorced from their output. I have even played with running a local model. This level of concern would be unwarranted if humans were decent and only trained Grok on, and asked Grok to generate, puppies playing in open fields.
Of course not, it is a picture of hundreds of thousands or maybe millions of people who offered up varying degrees of consent* for the use of their bodies to make any kind of porn.
*Usually 0%, since illegally scraped data is not covered by some hostile TOS agreement when the scraper uploads it as training data without even the knowledge or consent of the company originally hosting it
I think this largely speaks for itself as some of the worst words ever put together in any order. Comparing human creativity, and how we draw inspiration from our forebears, to the present subject is abhorrent. Imagining human minds turning every single speck of human flesh they see into jackoff material, because you assume that’s how the mind works since you learned it from a robot, is beyond everything.
Die on some other hill unless you’re being paid well. This is a neo Nazi’s CSAM and propaganda machine. If they want to fix it, they’ll scrub their training data and figure out the weights and do a massive ban wave on the abusers. It is not on this world to suffer excuses for this hideous fucking bullshit.