- cross-posted to:
- [email protected]
Literally this meme again

What a reprehensible, disingenuous representation of what he actually said. I’m not a fan of the guy, but PC Gamer is trash as well. Scary to see how people here are reacting just because it’s about X and AI.
Yeah nobody in this thread went past the title, but that’s literally not what he said.
He actually said that demanding X remove AI features is gatekeeping since competitors get to keep them, which is still a dumb take but very, very far from “Tim Sweeney loves child porn”…
Man that title gives me a stroke trying to decipher it… it almost reads like Tim Sweeney wants Twitter banned but clearly that’s not the case…
steam
does nothing
wins
Did he take an oath against common sense? Is he bound by a curse to have bad takes for his entire life? Does he ragebait as a living? What the actual fuck is up with this man?
Guy atomically made of shit takes has another shit take, colour me surprised.
I think it’s more a:
“Guy who enjoys making child porn on xhitter gets angry when decent people want to ban it.”
type situation. If I see someone arguing for something, that’s because they want it. Or want to use it.
TIL Tim Sweeney is into child porn. Not surprising tbh.
The fall of Rome… The fall of the perverse…
I wonder which AI companies he’s invested in
If you can be effectively censored by the banning of a site flooded with CSAM, that’s very much your problem and nobody else’s.
Nothing made-up is CSAM. That is the entire point of the term “CSAM.”
It’s like calling a horror movie murder.
The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as “evidence of child sexual abuse” that “includes both real and synthetic content.”
Were you too busy fapping to read the article?
It’s too hard to tell real CSAM from AI-generated CSAM. Safest to treat it all as CSAM.
I get this and I don’t disagree, but I also hate that AI fully brought back thought crimes as a thing.
I don’t have a better approach or idea, but I really don’t like that simply drawing a certain arrangement of lines and colors is now a crime. I’ve also seen a lot of positive sentiment at applying this to other forms of porn as well, ones less universally hated.
Not supporting this use case at all and on balance I think this is the best option we have, but I do think thought crimes as a concept are just as concerning, especially given the current political climate.
Sure, I think it’s weird to really care about loli or furry or any other niche, but AI generating material of actual children (and unwilling people besides) is actually harmful. If they can’t have effective safeguards against that harm, it makes sense to restrict it legally.
You can insist every frame of Bart Simpson’s dick in The Simpsons Movie should be as illegal as photographic evidence of child rape, but that does not make them the same thing. The entire point of the term CSAM is that it’s the actual real evidence of child rape. It is nonsensical to use the term for any other purpose.
The *entire point* of the term CSAM is that it’s the actual real evidence of child rape.
You are completely wrong.
https://rainn.org/get-the-facts-about-csam-child-sexual-abuse-material/what-is-csam/
“CSAM (“see-sam”) refers to any visual content—photos, videos, livestreams, or AI-generated images—that shows a child being sexually abused or exploited.”
“Any content that sexualizes or exploits a child for the viewer’s benefit” <- AI goes here.
RAINN has completely lost the plot by conflating the explicit term for Literal Photographic Evidence Of An Event Where A Child Was Raped with made-up bullshit.
We will inevitably develop some other term like LPEOAEWACWR, and confused idiots will inevitably misuse that to refer to drawings, and it will be the exact same shit I’m complaining about right now.
Dude, you’re the only one who uses that strict definition. Go nuts with your crusade of prescriptivism, but I’m pretty sure it’s a lost cause.
Child pornography (CP), also known as child sexual abuse material (CSAM) and by more informal terms such as kiddie porn, is erotic material that involves or depicts persons under the designated age of majority.
[…]
Laws regarding child pornography generally include sexual images involving prepubescents, pubescent, or post-pubescent minors and computer-generated images that appear to involve them.
(Emphasis mine)

‘These several things are illegal, including the real thing and several made-up things.’
Please stop misusing the term that explicitly refers to the real thing.
‘No.’
deleted by creator
AI CSAM was generated from real CSAM
AI being able to accurately undress kids is a real issue in multiple ways
AI can draw Shrek on the moon.
Do you think it needed real images of that?
It used real images of Shrek and the moon to do that. It didn’t “invent” or “imagine” either.
The child porn it’s generating is based on literal child porn, if not itself just actual child porn.
You think these billion-dollar companies keep hyper-illegal images around, just to train their hideously expensive models to do the things they do not want those models to do?
Like combining unrelated concepts isn’t the whole fucking point?
No, I think these billion dollar companies are incredibly sloppy about curating the content they steal to train their systems on.
It literally can’t combine unrelated concepts though. Not too long ago there was the issue where one (Dall-E?) couldn’t make a picture of a full glass of wine because every glass of wine it had been trained on was half full, because that’s generally how we prefer to photograph wine. It has no concept of “full” the way actual intelligences do, so it couldn’t connect the dots. It had to be trained on actual full glasses of wine to gain the ability to produce them itself.
And you think it’s short on images of fully naked women?
Yet another CEO who’s super into child porn huh?
Maybe we are the only people that don’t f kids. Maybe this is “H”, “E”, double hockey sticks.
I’ll keep in mind that Tim thinks child porn is just politics.
It is when one side of the political spectrum is “against” it but keeps supporting people who think CSAM is a-okay, while the other side finds it abhorrent regardless of who’s pushing it.
I mean, the capitalists are the ones calling the shots, since the imperial core is no democracy. This is their battle; we are their dildos.
Somebody is in a certain set of files
Someone beat this man for attempting to defend AI CSAM
His opinion is as trash as his gaming storefront that insists it’s a platform.
inb4 “In a stunning 5-4 decision, the Supreme Court has ruled that AI-generated CSAM is constitutionally protected speech”
There is no such thing as generated CSAM, because the term exists specifically to distinguish anything made-up from photographic evidence of child rape. This term was already developed to stop people from lumping together Simpsons rule 34 with the kind of images you report to the FBI. Please do not make us choose yet another label, which you would also dilute.
Dude, just stop jerking off to kids whether they’re cartoons or not.
‘If you care about child abuse please stop conflating it with cartoons.’
‘Pedo.’
Fuck off.
Generating images of a minor can certainly fulfill the definition of CSAM. It’s a child, it’s sexual, it’s abusive, it’s material. It’s CSAM, dude.
These are the images you report to the FBI. Your narrow definition is not the definition. We don’t need to make a separate term because it still impacts the minor even if it’s fake. I say this as a somewhat annoying prescriptivist pedant.
There cannot be material from the sexual abuse of a child if that sexual abuse did not fucking happen. The term does not mean ‘shit what looks like it could be from the abuse of some child I guess.’ It means, state’s evidence of actual crimes.
It is sexual abuse even by your definition if photos of real children get sexualised by AI and land on xitter. And afaik that is what’s happened. These kids did not consent to have their likeness sexualised.
Nothing done to your likeness is a thing that happened to you.
Do you people not understand reality is different from fiction?
My likeness posted for the world to see in a way i did not consent to is a thing done to me
Your likeness depicted on the moon does not mean you went to the moon.
CSAM is abusive material of a sexual nature of a child. Generated or real, both fit this definition.
CSAM is material… from the sexual abuse… of a child.
Fiction does not count.
You’re the only one using that definition. There is no stipulation that it’s from something that happened.
Where is your definition coming from?
My definition is from what words mean.
We need a term to specifically refer to actual photographs of actual child abuse. What the fuck are we supposed to call that, such that schmucks won’t use the same label to refer to drawings?
How do you think a child would feel after having a pornographic image generated of them and then published on the internet?
Looks like sexual abuse to me.
The Rape, Abuse, & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as “evidence of child sexual abuse” that “includes both real and synthetic content.”