[-] rockSlayer@lemmy.world 118 points 10 months ago

Tell your friend to log the IP address and report it to the authorities. They might need to turn over the entire modlog as well

[-] YIj54yALOJxEsY20eU@lemm.ee 55 points 10 months ago

This is likely in reference to the federation of such images posted elsewhere

[-] db2@sopuli.xyz 76 points 10 months ago

There's always someone who doesn't mind ruining it for everyone else. Probably safest to just delete all the images, that way there's no need to look.

[-] Szymon@lemmy.ca 63 points 10 months ago

Bad actors will try to nuke the entire platform to maintain a monopoly on this format of communication and community.

[-] andrew@lemmy.stuart.fun 35 points 10 months ago

Who could you posspezibly be referring to?

[-] Etienne_Dahu@jlai.lu 3 points 10 months ago

Is it the android? The lone skum? Or someone else entirely?

[-] acastcandream@beehaw.org 65 points 10 months ago

Once again reaffirming why I refuse to host an instance. If I ever do, I’m not federating with any of you degenerates lol

[-] Maajmaaj@lemmy.ca 22 points 10 months ago

Your friend should have restricted account creation.

[-] robotrash@lemmy.robotra.sh 74 points 10 months ago

Federation still causes those images to be saved on your hardware, even if the account that creates it is hosted somewhere else.
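For what it's worth, the mechanics are roughly this: when a post federates in, the receiving instance fetches the attached media and caches a local copy so it can serve thumbnails to its own users. A toy sketch of that pattern in Python (the cache path is a made-up example, not Lemmy's actual layout):

```python
import hashlib
import urllib.request
from pathlib import Path

CACHE_DIR = Path("/var/lib/pictrs/cache")  # hypothetical cache location

def cache_remote_image(url: str) -> Path:
    """Mirror an image referenced by a federated post onto local disk.

    This is why a remote user's upload ends up on *your* hardware:
    your instance stores a copy to serve to its own users.
    """
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    # Name the cached file by a hash of its source URL.
    dest = CACHE_DIR / hashlib.sha256(url.encode()).hexdigest()
    if not dest.exists():
        with urllib.request.urlopen(url) as resp:
            dest.write_bytes(resp.read())
    return dest
```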

[-] Maajmaaj@lemmy.ca 13 points 10 months ago
[-] pinkdrunkenelephants@sopuli.xyz 10 points 10 months ago* (last edited 10 months ago)

It's serious flaw of federation #19865438736, and it'll go ignored even when innocent instance admins end up getting jailed over it

[-] PsychedSy@sh.itjust.works 5 points 10 months ago

It's software currently in development so hopefully they'll find alternative ways to handle it.

[-] whofearsthenight@lemm.ee 10 points 10 months ago

This is kind of a major problem with lemmy, and the fact that they don't have CSAM detection on the roadmap is going to make wide adoption a near impossibility. The other thing, though, is that even automated CSAM detection isn't 100%, so hosting your own instance likely means you're going to have to view CSAM and other fucked up shit at some point to properly moderate it, even if you're just hosting for yourself. Tbh I was strongly considering hosting my own instance because it's not like, that hard/expensive, but this saga has turned me completely off of that idea, even just for myself.

This actually makes me wonder how much of this type of thing reddit mods deal with, versus paid employees like at facebook, which has a paid army handling content moderation. Oh, and speaking of xitter, which now has neither volunteer mods nor a moderation team since Elon fired them all, I assume the freaks have just decided that's their hosting platform of choice.

[-] robotrash@lemmy.robotra.sh 4 points 10 months ago

I'll be honest, I'm probably just going to do a scheduled wipe of the pictrs directory of my local instance every week or whatever. I've done them manually a few times and they've had zero effect on my experience.
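For anyone wanting to script that, here's a minimal sketch of the same idea: delete cached media older than a week. The directory path is an assumption about your deployment, and on a real instance you'd want to use pict-rs's own purge tooling (or stop the service first) rather than deleting files out from under it:

```python
import time
from pathlib import Path

PICTRS_FILES = Path("/var/lib/pictrs/files")  # assumed location, check your setup
MAX_AGE = 7 * 24 * 3600  # one week, in seconds

def wipe_old_media() -> int:
    """Delete cached media files older than MAX_AGE; return how many."""
    now = time.time()
    removed = 0
    for f in PICTRS_FILES.rglob("*"):
        if f.is_file() and now - f.stat().st_mtime > MAX_AGE:
            f.unlink()
            removed += 1
    return removed

if __name__ == "__main__":
    print(f"removed {wipe_old_media()} files")
```

Drop it in a weekly cron job and it approximates the manual wipes described above.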

[-] rob64@startrek.website 16 points 10 months ago

I think it was an issue where the CSAM was being copied to servers via normal federation with the instance(s) being spammed.

[-] 01189998819991197253@infosec.pub 20 points 10 months ago

I'm glad s/he was able to nuke the CSAM, even if other material was nuked with it. This crap is why I'm not hosting.

Please, call it CSAM (child sexual abuse material) and not CP (child pornography). The children in these photos/videos can't make pornography, they're sexually abused into making this material. CP insinuates that it's legitimate porn with children. CSAM, on the other hand, calls it what it is: sexual abuse of children.

[-] Tranus@programming.dev 32 points 10 months ago

That is needlessly pedantic. I have never heard of anyone using the word pornography to imply legality or moral acceptability. There is no such thing as "legitimate" CP, so there is no need to specify that it's not ok every time it is mentioned. No one in their right mind would presume he's some kind of CP supporting monster for failing to do so.

[-] TheFrirish@jlai.lu 12 points 10 months ago

If we spent more time fixing things rather than naming them the world would be a better place.

[-] 01189998819991197253@infosec.pub 4 points 10 months ago* (last edited 10 months ago)

No one in their right mind would assume that OP is. But the term was created to legitimize the material. So while you're correct that it's picky, it's picky for a reason. Words are powerful. We should refuse to lend that term, among other things, any legitimacy.

[-] Andrew15_5@mander.xyz 15 points 10 months ago
[-] neeeeDanke@feddit.de 9 points 10 months ago

I know that guy, Tobias Fünke, although he's also an analyst. He had some clever abbreviation for that as well!

[-] A10@kerala.party 13 points 10 months ago

Bless you ❤️

[-] pinkdrunkenelephants@sopuli.xyz 12 points 10 months ago

I'm not gonna lie, I'm surprised it took this long for some dipshit to try something like this. Lemmy's security has more holes in it than a piece of Swiss cheese, and we're fools if we think it's viable enough to serve as a long-term home for new social media.

We really, really need a better social structure than federation.

[-] KairuByte@lemmy.dbzer0.com 16 points 10 months ago

Lemmy’s security has more holes in it than a piece of Swiss cheese

This has very little to do with security. There's nothing inherently "insecure" about posting CSAM, since the accounts and images were likely created and posted just like any others.

What really needs to happen is some sort of detection of that kind of content (which would likely require a large code change) or additional moderation tools.
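The usual shape of that kind of detection is hash matching at upload time against an industry-maintained list of known material (real deployments use perceptual-hash services like PhotoDNA, since plain cryptographic hashes break on any re-encode). A toy sketch, with the hash list left empty as a placeholder:

```python
import hashlib

# Placeholder for hashes of known-bad images from a clearinghouse;
# real systems use perceptual hashes, not SHA-256.
KNOWN_BAD_HASHES: set[str] = set()

def screen_upload(data: bytes) -> bool:
    """Return True if the upload should be rejected and flagged."""
    if hashlib.sha256(data).hexdigest() in KNOWN_BAD_HASHES:
        # A real pipeline would quarantine the file, preserve evidence,
        # and report to the relevant authority rather than just reject.
        return True
    return False
```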

[-] pinkdrunkenelephants@sopuli.xyz 5 points 10 months ago

The lack of those tools is what I was talking about

[-] KairuByte@lemmy.dbzer0.com 11 points 10 months ago

Ah okay, those aren't generally considered security, but I can understand why you went that route, I suppose.

[-] pinkdrunkenelephants@sopuli.xyz 3 points 10 months ago

Does anyone know why they were never put in?

[-] KairuByte@lemmy.dbzer0.com 6 points 10 months ago

Software development is a balancing act. You need to pick and choose not only what features to add, but when to add them. Sometimes mistakes are made in the planning and you get a situation like this.

What likely happened is that these kinds of features were deemed less likely to be needed, since the majority of lemmy users will never run into a need for them, and there is technically a way to handle the situation (nuking your instance's image cache). But you'll likely see a reshuffling of priorities if these kinds of attacks become more prevalent.

[-] lemann@lemmy.one 9 points 10 months ago

Lemmy's security

I think you misspelled "moderation tools". A nice quick fix would have been to block posts from new users on X instance and have a pinned post briefly covering why - they'll eventually run out of instances that don't have open signups IMO, or just give up.

Another mod tools option would be rate limiting of posts, e.g. users can only make a new shitpost every 10-15 min, rather than unlimited times per minute.
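That kind of per-user limit is usually a token bucket or a fixed window. A minimal fixed-window sketch (the interval and in-memory store are illustrative; a real instance would persist this in its database or something like Redis):

```python
import time
from collections import defaultdict

POST_INTERVAL = 10 * 60  # seconds a user must wait between posts
_last_post: dict[str, float] = defaultdict(float)

def may_post(user_id: str) -> bool:
    """Allow a post only if the user's last one was POST_INTERVAL ago."""
    now = time.time()
    if now - _last_post[user_id] >= POST_INTERVAL:
        _last_post[user_id] = now
        return True
    return False
```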

[-] csolisr@communities.azkware.net 10 points 10 months ago

Meanwhile, my YunoHost-based instance, which still hasn't managed to get Pict-RS working and therefore can't store images even if it wanted to, is doing juuuuust fine

[-] Etienne_Dahu@jlai.lu 6 points 10 months ago

Come to think of it, if you're the only user, it's kinda protecting you, isn't it? (hello fellow Yunohost user!)
