this post was submitted on 11 Sep 2023
cross-posted from: https://lemmy.world/post/3320637

YouTube and Reddit are sued for allegedly enabling the racist mass shooting in Buffalo that left 10 dead: The complementary lawsuits claim that the massacre in 2022 was made possible by tech giants, a local gun shop, and the gunman's parents.

[–] Colorcodedresistor@lemm.ee 19 points 2 years ago (1 children)

They blamed books for copycat killers, and movies and video games for shootings; now they want to blame websites...

Now they're trying to sue people over hindsight? This isn't Minority Report. This is "let's throw a lot of torts and other legal BS at the wall and pray something sticks."

[–] VonCesaw@lemmy.world 40 points 2 years ago (2 children)

Setting a legal precedent so that platforms AVOID showing the offending content instead of PROMOTING it is probably the goal.

About 30-40 times a day, YouTube Shorts shows me videos actively advocating violence, and I know for sure that Google currently has enough money and resources to prevent these videos from being shown, considering it AUTOMATICALLY SUBTITLES THEM.

[–] derpgon@programming.dev 24 points 2 years ago

I had to manually report a 100k-view Short showing someone killing a snail with an air gun. It got removed almost instantly.

Sure, it's a snail, and sure, it's an air gun, but exactly this type of video is a breeding ground for sickos. And no, YouTube, the 1-million-subscriber Minecraft channel that said "kill a creep" is not really violent, and neither is someone who says "fuck" in the first 30 seconds.

Gosh I hate the platform.

[–] SCB@lemmy.world 3 points 2 years ago (2 children)

What are you searching for that YouTube shorts shows you this?

[–] vaultdweller013@sh.itjust.works 2 points 2 years ago

You don't need to search for much; if you remember 2015 YouTube suggestions, you've got a pretty good idea of how bad the Shorts algorithm is. I don't personally use them, but my friend does.

[–] VonCesaw@lemmy.world 1 points 2 years ago

You don't generally search with YouTube Shorts; it presents you with content. Roughly:

- 20% is YOUR recommendations (channels you subscribe to)
- 20% is near-interests (people who make similar content)
- 20% is whatever is popular at the moment, or whatever a-la-carte foreign-language/low-effort content (Garena Free Fire, Fortnite) it wants to give you
- 20% is locational (at home I get anime recommendations due to a housemate, at work I get Vegas/Disney vacation or AI garbage)
- 20% is whatever the algorithm is pushing (channels I have BLOCKED but still appear, TikTok exiles, cooking videos)