submitted 1 year ago by L4s@lemmy.world to c/technology@lemmy.world

‘It scars you for life’: Workers sue Meta claiming viewing brutal videos caused psychological trauma::More than 20% of the staff Meta hired to check the violent content of Facebook and Instagram are on sick leave due to psychological trauma.

[-] lloram239@feddit.de 17 points 1 year ago* (last edited 1 year ago)

How much violent content is even out there that couldn't be trivially blocked just by keeping a list of content IDs? And how much of it would you actually need to watch in full to pass judgement?

I seriously don't get why this is a problem in the first place. Every tiny nip-slip gets you instantly blocked on Facebook and Instagram; they always default to "block" without any closer inspection. They are content moderators, after all, not criminal investigators, so there shouldn't be any need to watch every detail. So why are they watching enough violent video to cause trauma instead of just hitting the block button or letting the computer do the work?
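A minimal sketch of the content-ID idea (the hashes and file bytes here are hypothetical): an exact-match blocklist is cheap and needs no human review, but a single changed byte produces a completely different digest and slips past it.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known-bad files.
blocklist = {hashlib.sha256(b"known bad video bytes").hexdigest()}

def is_blocked(data: bytes) -> bool:
    # Exact-match lookup: O(1), no one has to watch anything.
    return hashlib.sha256(data).hexdigest() in blocklist

print(is_blocked(b"known bad video bytes"))   # True: matches the list
print(is_blocked(b"known bad video bytes!"))  # False: one extra byte evades it
```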

[-] some_guy 26 points 1 year ago

Both lawyers agree that Meta's policy of forcing employees to watch the entire video in order to explain all the reasons for censorship aggravates the trauma.

It's in the article.

[-] JasSmith@sh.itjust.works 3 points 1 year ago

The EU now has a rule that all reports of content must be checked and verified for illegal content like misinformation. They can’t automatically block that content because then people would weaponise reports. At best they can automatically block video and image hashes which have been previously verified as illegal, but these are trivial to circumvent. I think they’ve started using perceptual hashes but these are far from perfect.
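A toy illustration of that difference (not Meta's actual pipeline; the pixel values are made up): a slight re-encode changes a cryptographic hash completely, while a simple average-hash, the most basic kind of perceptual hash, barely moves. That tolerance is also the weakness: an edit crafted to flip enough bits evades the perceptual match too.

```python
import hashlib

def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if above the mean.
    `pixels` stands in for a downscaled grayscale image."""
    avg = sum(pixels) / len(pixels)
    return [1 if p > avg else 0 for p in pixels]

def hamming(a, b):
    # Number of differing bits between two hashes.
    return sum(x != y for x, y in zip(a, b))

original = [10, 200, 30, 180, 90, 250, 40, 120]
tweaked  = [12, 198, 30, 181, 90, 250, 41, 120]  # slight compression noise

# The cryptographic hashes no longer match at all...
print(hashlib.sha256(bytes(original)).hexdigest() ==
      hashlib.sha256(bytes(tweaked)).hexdigest())          # False
# ...but the perceptual hashes are identical (distance 0).
print(hamming(average_hash(original), average_hash(tweaked)))  # 0
```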

I believe they use similar moderation in the US to proactively head off legislation similar to the EU's.

Something like 3 billion people actively use Facebook each month. There must be tens of millions of daily reports. I can only imagine the level of planning, staffing, and tools which are required to facilitate that.

[-] pinkdrunkenelephants@lemmy.cafe 1 points 1 year ago

They need an AI to curate that kind of content then.

[-] JasSmith@sh.itjust.works 2 points 1 year ago

AI is far from perfect, and is unlikely to satisfy the DSA requirements.

[-] Natanael@slrpnk.net 3 points 1 year ago

It's trivial to circumvent automatic detection

this post was submitted on 20 Oct 2023
535 points (98.2% liked)
