cross-posted to:
- [email protected]
I'm not going to comment on whatever he's commenting on.
I'm just going to re-affirm that Tim Sweeney is a fucking moron, in any and all cases.
Just look at that guy. If you were to ask 1000 people to describe what they thought a typical CSAM viewer looked like and averaged their responses together, you would get something like this photo of Tim Sweeney.
I bet he's kind of right. Here in the UK we just lost a whole bunch of rights and privacy online under the guise of "protect the kids", but it's kind of weird to be piping up against it when there are actually protections needed.
It would be weird if it were the only time he’s called out the Google and Apple monopolies and their control over apps, but it’s been a running theme for him (and his legal battles). Two examples from a quick lookup:
- His tweet where he calls out Apple for removing privacy apps at the request of Russia, and for allegedly threatening to remove Twitter in 2024.
- His tweet on Apple removing the Russian social media app VK following the US sanctions related to the Russian invasion of Ukraine in 2022.
Personally, I see no issue with platforms removing content they deem to be problematic, and I'm sure Sweeney agrees, given that the Epic store prohibits pornography, for example. However, as he's said repeatedly, Apple in particular is unique in that removing an app means there's practically no way for an iPhone user to access it, since there's no sideloading.
If it were an app dedicated to CSAM, I don't think anyone would take issue, but his argument is that removing the app would deplatform all of its 500M users, most of whom are probably not pedos. I'm critical of people being on X, but it's also undeniable that despite its far-right lean and CEO, there are still leftists and people belonging to minority groups who are on it, for whatever reason. Are they pedophile and Nazi enablers too? I'm inclined to say yes, but I don't know how many people would agree.
Edit: Format and details.
This isn't really a change, though, I'm pretty sure. People have been able to make photo-realistic depictions for a lot longer than AI has existed, and those have rightfully been held to be illegal in most places because the confusion they cause makes it harder to stop the real thing.
I think the difference here is that Twitter has basically installed a “child porn” button. If their reaction had been to pull the product and install effective safeguards, it wouldn’t be as bad. It’s a serious fuckup, but people screw up every day.
Instead, they’ve made it so you can pay them to have access to the child porn generator.
It's not really a change so much as it's suddenly incredibly easy for anyone, of any ability, to do it as much as they want with near-seamless results.
Every year it's gotten easier and easier to do it more and more believably, but now all you have to do is literally ask the computer and it happens. The line has to be drawn somewhere.
How is he wrong?
What images can I make in Grok that can’t be done with Gemini or GPT?
He’s wrong because he’s not Gabe Newell. On a more serious note, the 404 report cited by the PCGamer article basically supports your point, though with the caveats that X and Musk are bad for other reasons and that those generated images make it into people’s feeds:
The major, uhh, downside here is that people are using Grok for the same reasons they use AI elsewhere, which is to nonconsensually sexualize women and celebrities on the internet […]
The situation on other platforms is better because there are fewer Nazis and because the AI-generated content cannot be created natively in the same feed, but essentially every platform has been polluted with this sort of thing, and the problem is getting worse, not better.
If my political opponents are actually sexual predators and their speech is sexual harassment, I’m down with censoring them. That should be the least of their problems.
That’s not even what gatekeeping means. Unless he’s trying to stand up for the universal right to participate in the child porn fandom.
He wants to use AI in his products but then not be responsible for his products.
Oh that’s an even worse (and probably accurate) interpretation.
“How are we supposed to do business if there are consequences for our actions?!”
bold words from someone who looks like the stock photo for a pedophile.
There’s this old adage, “never attribute to malice that which can be explained by stupidity”.
Tim Sweeney is very ignorant. However, he's also pretty malicious. His fedora'd waffling should probably be taken exactly for what it is.
I absolutely hate Hanlon's razor. It is only ever used to try to protect obviously malicious people.
this guy is out of his mind
If there is one gate that definitely needs keeping, it is the kindergarten’s gate. Don’t let those creeps get away with it…
From making CSAM? Makes you wonder about this Sweeney guy.
Did Covid-19 make everyone lose their minds? This isn't even about being cruel or egotistical. This is just a stupid thing to say. Has the world lost the concept of PR??? Genuinely defending 𝕏 in the year 2026… for deepfake porn, including of minors??? From the Fortnite company guy???
Did Covid-19 make everyone lose their minds?
Every day further convinces me we all died of COVID, and this is The Bad Place.
Unironically this behaviour is just “pivoting to a run for office as a Republican” vibes nowadays.
It's no longer even 'weird behaviour' for a US CEO.
For some reason Epic Games just lets Tim Sweeney say the most insane things. If I were a shareholder I'd want someone to take his phone off him.
For those who aren't doing this yet: get in, losers, we're going 0.0.0.0-ing.
https://steamcommunity.com/sharedfiles/filedetails/?id=2987082604
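For anyone unfamiliar, the idea is null-routing unwanted hostnames in your system hosts file so they never resolve; a minimal sketch of what an entry looks like, using placeholder domains rather than anything specific from the linked guide:

```
# /etc/hosts on Linux/macOS; C:\Windows\System32\drivers\etc\hosts on Windows.
# Pointing a hostname at 0.0.0.0 (a non-routable address) makes connections to it fail immediately.
# Placeholder domains only; substitute whatever you actually want to block.
0.0.0.0    telemetry.example.com
0.0.0.0    ads.example.com
```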
TIL Tim Sweeney doesn’t know what gatekeeping is
Who else just did a search on the Epstein files for “Tim Sweeney”?
I didn't find anything on jmail, but there's still a lot that hasn't been released, and a lot of stuff is still redacted.
Man I just ran into 3 site blockages trying to open this. Somebody REALLY doesn’t want this to be read.