Hello everyone,
We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do because they just post from another instance since we changed our registration policy.
We keep working on a solution; we have a few things in the works, but they won't help us right now.
Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.
Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not be stopped even with 10 moderators. And if it wasn't his community, it would have been another one. And it is clear this could happen on any instance.
But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what's next very soon.
Edit 2: removed that bit about the moderator tools. That came out a bit harsher than we meant it. It's been a long day, and having to deal with this kind of stuff got some of us a bit salty, to say the least. Remember we also had to deal with people posting scat not too long ago, so this isn't the first time we've felt helpless.
Anyway, I hope we can announce something more positive soon.
This whole situation is shitty all around. I was really hoping it would be an isolated incident and that they wouldn't come back. I think I speak for everyone when I say that the bastards posting CSAM need to be jailed or worse. Disagree with an instance or community all you want, but the instant you pull something like this, you've lost every single argument and are irredeemably a horrible person not worthy of touching a computer ever again.
Once again, pedophiles ruin everything nice. I know saying this isn't that helpful since I can't do anything about it, but I'm sorry this is happening on your instance (and is getting federated to other instances). Don't worry about inconveniencing regular users; taking action against CSAM is far more important, and any halfway reasonable user will understand.
Some resources for reporting CSAM if you come across it anywhere (not just Lemmy):
US: https://www.missingkids.org/cybertipline
Canada: https://cybertip.ca/app/en/
International: https://www.inhope.org/
Last but not least, a reminder that if you accidentally load CSAM on your device, in most cases it will get cached to your storage, because the majority of apps and browsers employ disk caching by default. You should at the very least clear your caches and then trim and fully wipe the free space on your device (maybe also directly shred the actual files if you can do that / know how to).

Also find out whether you have any mandatory reporting laws where you live and comply with them. (EDIT: another commenter mentioned that in some jurisdictions you might actually not be allowed to delete the files immediately and, presumably, have to contact the police right away to report the active file on your device.) CYA to prevent yourself from getting screwed because of someone else's horrible acts.

Also, something I've been thinking about since this whole thing started: it might be helpful to use a no-disk-cache browser/app (or disable disk caching in your current browser/app if you can) if you do wish to keep using Lemmy, at least until this whole thing blows over. That way you can just close the page/program and reboot your device, and the local copy should be gone. This matters especially because flash storage can't be reliably wiped with the "fill the drive with blank data" method (I'm not sure how big the risk is of it ending up in the swapfile or otherwise sticking around, or at what point it stops counting as possession).

Being exposed to CSAM is a nightmare for this reason, and unfortunately there seem to be no good resources on what to do if it happens to you.
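To make the cache-clearing step a bit more concrete, here's a rough sketch of what it can look like on a Linux desktop. To be clear, this is just an illustration, not a vetted tool: the paths are assumptions that vary by browser, OS, and profile, and it only touches the browser disk cache, not swap, thumbnails, or free space.

```python
#!/usr/bin/env python3
"""Rough sketch: clear common browser disk-cache directories on Linux.

The paths below are assumptions for illustration; check where your own
browser/app actually stores its cache. This does NOT wipe free space or
swap -- use your OS's own tools for that.
"""
import shutil
from pathlib import Path

# Hypothetical cache locations -- adjust for your browser, OS, and profile.
CACHE_DIRS = [
    Path.home() / ".cache" / "mozilla" / "firefox",   # Firefox (Linux)
    Path.home() / ".cache" / "chromium",               # Chromium (Linux)
    Path.home() / ".cache" / "google-chrome",          # Chrome (Linux)
]


def clear_cache_dir(path: Path) -> None:
    """Delete a cache directory (and everything in it) if it exists."""
    if path.is_dir():
        shutil.rmtree(path, ignore_errors=True)
        print(f"cleared {path}")
    else:
        print(f"skipped {path} (not found)")


if __name__ == "__main__":
    for cache_dir in CACHE_DIRS:
        clear_cache_dir(cache_dir)
```

As for the no-disk-cache option: Firefox, for example, has a browser.cache.disk.enable toggle in about:config, and private/incognito windows in most browsers generally keep their cache in memory rather than on disk; check your own browser's documentation rather than taking my word for it.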
I am not a lawyer and no part of this comment is legal advice.
If that's the case, then I stand corrected; again, not a lawyer and not giving any legal advice. I'm in Canada and need to check my local laws as well; it seems like a really important thing to know in general for anyone who uses the internet. I also think governments pretty much across the board need to do a way better job of actually educating people on exactly what to do if they are accidentally exposed to CSAM, because it seems like those resources are basically nonexistent.