I kinda wonder though, how would you go about making a law against cp that doesn't hurt small sites like lemm.ee?
The issue is that you really can’t. The laws are written specifically to prevent plausible deniability. Because pedos would be able to go “lol a troll sent it to me” and create some doubt in a jury. Remember that (at least in America) the threshold for conviction is supposed to be “beyond a reasonable doubt.” So if laws were focused on intent, all the pedos would need to do is create reasonable doubt, by arguing that they never intended to view/own the CSAM.
This was particularly popular in the Napster/Limewire days, when trolls would upload CSAM under innocuous titles, so people looking for the newest episode of their favorite show would find CSAM instead. You could literally find CSAM titled things like “Friends S10E9” because trolls were going for the shock factor of an innocent person opening a video only for it to end up being hardcore CSAM. Lots of actual pedos tried using the “I downloaded it by accident” defense.
So instead, the laws are written to close that loophole. It doesn’t matter why you have the CSAM. All that matters is you have it. The feds/courts won’t give a fuck if it was due to you seeking it out or if it was due to a bad actor sending it to you.
And that’s pretty much where we are now. Bad actors creating bot accounts on multiple instances, to spam the larger (most popular) instances with CSAM.
I think they have oversimplified the situation to the point that it is wrong.
Arguably, Lemmy instance providers (depending on where they live) are protected in the same way Facebook or other content hosts are. So long as you are acting in good faith, you are protected against any illegal content your users upload. This does mean you need to remove illegal content as you become aware of it; you can't just ignore what your users are doing.
There have been cases where, although a user technically 'possessed' CSAM, it was shown that they did so unknowingly via thumbnails or cached files. The police do investigate where it came from. It's not as simple as sending it to someone and having them convicted.
Yes, you'd just need to show that you actively moderate/apply content policies.
This will vary by jurisdiction, but most of the West has similar laws, I believe.
Lemmy instances are likely already legally protected in many countries so long as they act in good faith, i.e. actively moderate.
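For what it's worth, the technical side of "actively moderating" uploads usually comes down to scanning files before they're served. Here's a minimal sketch in Python of one such check, assuming a hypothetical hook in the upload path; the `KNOWN_BAD_HASHES` set and `accept_upload` helper are made up for illustration, and real systems (PhotoDNA-style services) use perceptual hashing so slightly altered copies still match:

```python
import hashlib

# Hypothetical list of known-bad hashes; a real deployment would pull these
# from an external hash-sharing program rather than hard-coding them.
KNOWN_BAD_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def accept_upload(path: str) -> bool:
    """Reject an upload whose hash matches the known-bad list."""
    return sha256_of(path) not in KNOWN_BAD_HASHES
```

An exact hash check like this is trivially defeated by re-encoding the image, which is why the scanning services admins actually rely on do fuzzy/perceptual matching instead, but the shape of the pipeline (hash on upload, block on match, log for review) is the same.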