submitted 11 months ago by LukeSky@lemmy.ml to c/asklemmy@lemmy.ml
[-] neshura@bookwormstory.social 16 points 11 months ago* (last edited 11 months ago)

Also, CSAM detection algorithms are known to misfire on occasion (it's hard, if not impossible, to tell apart a picture of a naked child shared for pornographic purposes from one that wasn't), and people want to avoid any false allegations of that if at all possible.
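To put rough numbers on why occasional misfires matter at scale: here's a minimal Bayes-rule sketch. Every rate in it (sensitivity, false positive rate, prevalence) is an illustrative assumption, not a figure from any real scanner — the point is just that when the targeted content is rare, even a small false positive rate means most flags land on innocent people.

```python
# Back-of-the-envelope: why a seemingly accurate detector still
# produces mostly false accusations when the real material is rare.
# All numbers below are illustrative assumptions, not real figures.

def positive_predictive_value(sensitivity: float,
                              false_positive_rate: float,
                              prevalence: float) -> float:
    """Probability that a flagged image is actually abusive material."""
    true_positives = sensitivity * prevalence
    false_positives = false_positive_rate * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# Hypothetical detector: catches 99% of real material and wrongly
# flags 0.1% of innocent images (e.g. ordinary family photos).
# Assume 1 in 100,000 scanned images is actually abusive.
ppv = positive_predictive_value(
    sensitivity=0.99,
    false_positive_rate=0.001,
    prevalence=1e-5,
)
print(f"Chance a flagged image is the real thing: {ppv:.1%}")
# ~1.0% -- under these assumptions, roughly 99 of every 100 flags
# would point at an innocent user.
```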
