Lemmyshitpost community closed until further notice - Lemmy.World
Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do, because they will just post from another instance since we changed our registration policy. We keep working on a solution; we have a few things in the works, but that won't help us now. Thank you for your understanding, and apologies to our users, moderators, and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world [https://lemmy.world/u/Striker], the moderator of the affected community, made a post apologizing for what happened. But this could not be stopped even with 10 moderators. And if it wasn't his community, it would have been another one. And it is clear this could happen on any instance. But we will not give up. We are lucky to have a very dedicated team, and we can hopefully make an announcement about what's next very soon.

Edit 2: removed that bit about the moderator tools. That came out a bit harsher than how we meant it. It's been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn't the first time we felt helpless. Anyway, I hope we can announce something more positive soon.
They also shut down registration
Whoever is spamming CP deserves the woodchipper
Looks like some CSAM fuzzy hashing would go a long way toward catching someone trying to submit that kind of content, if each uploaded image is scanned.
https://blog.cloudflare.com/the-csam-scanning-tool/
Not saying to go with Cloudflare (just showing how the detection works overall), but some kind of built-in detection system coded into Lemmy that grabs an updated hash table periodically would do the job.
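Roughly, a minimal sketch of that flow, assuming a generic perceptual hash (pHash via the Python `imagehash` library) stands in for the proprietary fuzzy hashes that PhotoDNA or Cloudflare's scanner actually use; the hash-list file, the distance threshold, and the `reject_and_report` hook are made up for illustration:

```python
# Sketch: fuzzy-hash check at upload time, using a generic perceptual hash (pHash).
# Real scanners (PhotoDNA, CSAI Match) use proprietary algorithms and curated hash lists;
# the hash list and threshold here are placeholders for illustration only.
from typing import Iterable

import imagehash      # pip install ImageHash
from PIL import Image # pip install Pillow

HAMMING_THRESHOLD = 8  # hypothetical "close enough" distance between hashes


def load_known_hashes(path: str) -> list[imagehash.ImageHash]:
    """Load a locally cached hash list (refreshed periodically from a trusted source)."""
    with open(path) as f:
        return [imagehash.hex_to_hash(line.strip()) for line in f if line.strip()]


def is_flagged(upload_path: str, known: Iterable[imagehash.ImageHash]) -> bool:
    """Return True if the uploaded image is within the threshold of any known hash."""
    h = imagehash.phash(Image.open(upload_path))
    # ImageHash overloads '-' as the Hamming distance between two hashes.
    return any(h - k <= HAMMING_THRESHOLD for k in known)


# In the upload handler (hypothetical hook names):
# if is_flagged(tmp_file, load_known_hashes("hashes.txt")):
#     reject_and_report(tmp_file)  # block the upload and file a report
```

The periodic refresh of the hash list is where a service like Cloudflare's (or a list distributed through NCMEC) would come in; because perceptual hashes survive re-encoding and small edits, the match is "fuzzy" rather than exact.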
Not a bad idea. I was once working on a project that would support user-uploaded images and looked into PhotoDNA, but it was an incredible pain in the ass to get access to. I'm surprised that someone hasn't realized that this should just be free and available. It's kind of gross that it's put behind an application/paywall, imo. They're just hashes and a library to generate the hashes. Why shouldn't that just be open source and available through the NCMEC?
Putting it behind a 3rd-party API with registration ensures that the 3rd party, which is under contract to report it, actually does so. It isn't enough just to block it; it needs to be reported too. Google and Cloudflare report it to the proper authorities.
Additionally, if it were open source, people trying to evade it could just download the tool and tweak their images until they no longer get flagged.
They could tweak their images regardless. Security through obscurity is never a good solution.
I can understand the reporting requirement.
Works only if your server is hosted in the US