451 points, submitted 31 May 2024 (2 months ago) by jeffw@lemmy.world to c/technology@lemmy.world
[-] CosmicCleric@lemmy.world 229 points 2 months ago* (last edited 2 months ago)

From the article...

But while many think that YouTube's system isn't great, Trendacosta also said that she "can't think of a way to build the match technology" to improve it, because "machines cannot tell context." Perhaps if YouTube's matching technology triggered a human review each time, "that might be tenable," but "they would have to hire so many more people to do it."

That's what it comes down to, right there.

Google needs to spend money on people, and not just rely on the AI automation, because it's obviously getting things wrong; it's not judging context correctly.

Anti Commercial-AI license (CC BY-NC-SA 4.0)

[-] SnotFlickerman@lemmy.blahaj.zone 75 points 2 months ago

US Corporations: But we can't start paying people to do work! That would completely wreck our business model!

Workers: So you would actually be bankrupt? Your corporation is that much of an empty shell?

US Corporations: Well, we really just don't want to have to spend less time golfing, and having to pay people might eventually cut into golf funds and time.

[-] blanketswithsmallpox@lemmy.world 4 points 2 months ago

YouTube is already a giant cost sink lmfao. It's basically the one decent thing they're still keeping up, which is why they've been monetizing it as much as possible lately.

[-] JasonDJ@lemmy.zip 2 points 2 months ago

And I just canceled my YouTube premium family in favor of SmartTube and Spotify.

Somehow I've yet to encounter a single ad in Spotify Free, and I have no idea how or why.

But the downside is that I want to subscribe to CuriosityStream/Nebula, and I can't find a referral link for the channels I like because they are all being skipped.

[-] chonglibloodsport@lemmy.world 68 points 2 months ago

Google is absolutely allergic to hiring humans for manual review. They view it as an existential issue because they have billions of users, which means they'd need to hire millions of people to do the review work.

[-] whoisearth@lemmy.ca 41 points 2 months ago

This isn't unique to Google, but if the system continues to be designed to allow companies to mask the true cost of doing business, we will never move past it.

We repeatedly undervalue ourselves for the sake of cheap products.

[-] chonglibloodsport@lemmy.world 16 points 2 months ago

I’m not sure what you mean by “true cost of business.” The biggest cost here is the issue of copyright claims and takedowns which were created by law in the first place, not by a natural phenomenon.

No matter what system we design, you’ll find that people adapt to take advantage of it. Well-meaning laws frequently have large and nasty unintended consequences. One of the biggest examples I can think of is the copyright system — originally intended to reward artists — which has led to big publishers monopolizing our culture.

[-] nixcamic@lemmy.world 7 points 2 months ago

That seems a bit excessive. Say all 8 billion people were using Google products: 8 million reviewers would be one per thousand users, which seems like far more than are needed, since almost all Google users are passive and don't create content.

[-] chonglibloodsport@lemmy.world 14 points 2 months ago* (last edited 2 months ago)

There are an estimated 720,000 hours of video uploaded to YouTube per day. At 8 hours per day it would take 90,000 people just to watch all those videos, working 7 days per week with no breaks and no time spent doing anything else apart from watching.

Now take into account that YouTube users watch over a billion hours of video per day and consider that even one controversial video might get millions of different reports. Who is going to read through all of those and verify whether the video actually depicts what is being claimed?
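
A quick back-of-envelope sketch of those numbers (the upload and watch-hour figures are the estimates quoted above; the 8-hour shift is the assumption):

```python
# Rough sanity check of the review-workload estimate above. The upload and
# watch-hour totals are the figures quoted in the comment; the 8-hour shift
# (every day, no breaks) is the stated assumption.

UPLOAD_HOURS_PER_DAY = 720_000        # estimated hours uploaded to YouTube daily
WATCH_HOURS_PER_DAY = 1_000_000_000   # estimated hours watched daily
SHIFT_HOURS = 8                       # one reviewer watching non-stop per day

reviewers_needed = UPLOAD_HOURS_PER_DAY / SHIFT_HOURS
print(f"Reviewers needed just to watch every upload: {reviewers_needed:,.0f}")
# -> 90,000

watch_to_upload_ratio = WATCH_HOURS_PER_DAY / UPLOAD_HOURS_PER_DAY
print(f"Hours watched per hour uploaded: {watch_to_upload_ratio:,.0f}")
# -> 1,389
```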

A Hollywood studio, on the other hand, produces maybe a few hundred to a few thousand hours of video per year (unless they’re Disney or some other major TV producer). They can afford to have a legal team of literally dozens of lawyers and technology consultants who just spend all their time scanning YouTube for videos to take down and issuing thousands to millions of copyright notices. Now YouTube has made it easy for them by giving them a tool to take down videos directly without any review. How long do you think it would take for YouTube employees to manually review all those cases?

And then what happens when the Hollywood studio disagrees with YouTube’s review decision and decides to file a lawsuit instead? This whole takedown process began after Viacom filed a $1 billion lawsuit against YouTube!

[-] nixcamic@lemmy.world 10 points 2 months ago

But they don't have to review every video, just the ones that are flagged by the AI and then contested, which is probably a fraction of a percent of all of them.

[-] werefreeatlast@lemmy.world 4 points 2 months ago

Just go to a public library, get on a computer and search for transparent undergarments. Or better yet, "the black tape project".

This will ensure the computer is going to be tainted forever with soft YouTube porn for everyone to enjoy.

[-] HereIAm@lemmy.world 61 points 2 months ago

They could also punish false claims. Currently, the copyright holders (and not even that; the match can just be something that vaguely sounds like their stuff) can automatically send out strikes for any match in the system. The burden of proving fair use falls on the YouTube channel, and if it's found not to be copyright infringement, nothing happens to the fraudulent claimant.

A big step would be to discourage the copyright holders from shooting from the hip.

[-] GBU_28@lemm.ee 14 points 2 months ago
[-] TachyonTele@lemm.ee 0 points 2 months ago

From the article…

But while many think that YouTube’s system isn’t great, Trendacosta also said that she “can’t think of a way to build the match technology” to improve it, because “machines cannot tell context.” Perhaps if YouTube’s matching technology triggered a human review each time, “that might be tenable,” but “they would have to hire so many more people to do it.”

That’s what it comes down to, right there.

Google needs to spend money on people, and not just rely on the AI automation, because it's obviously getting things wrong; it's not judging context correctly.

I hereby grant approval for anybody to change, alter, and/or use my comment for AI and commercial means.

[-] General_Effort@lemmy.world -1 points 2 months ago

I hereby grant approval for anybody to change, alter, and/or use my comment for AI and commercial means.

I'm guessing this is what gets you down-voted. The "information wants to be owned" brigades are out in full force today.

[-] General_Effort@lemmy.world -5 points 2 months ago

Oh yes, let's make a private company adjudicate the law. That'll teach em.

[-] ShepherdPie@midwest.social 6 points 2 months ago

That's already what they're doing essentially. This person is just advocating for an actual human to review these rather than some black-box algorithm.

[-] General_Effort@lemmy.world -1 points 2 months ago

Not really. They have to do something, or they become liable. If YouTube decides that something is fair use and a court disagrees, then they are on the hook for damages. They'd have to pay a lot of money to copyright lawyers, only for the chance of having to pay damages.

And, you know... the same libertarians who are now attacking YouTube for not going full feudal would be absolutely outraged if they did fight for fair use. It's stealing property, as far as they are concerned.

[-] ShepherdPie@midwest.social 5 points 2 months ago

I'm not even sure what you're arguing for, since you seem to have done a complete 180 on your stance. You earlier said you don't want YouTube adjudicating the law (by choosing sides in a copyright claim), but now you're arguing that they have to do this in order to avoid liability.

The issue here is copyright trolls claiming copyright over things that don't belong to them. In many cases, YouTube sides with these trolls and steals revenue from the actual content creators simply by virtue of a claim having been made in the first place, which lends a lot of legitimacy to the trolls even if it's complete fraud (similar to police testimony in court being treated like gospel). Currently, these cases are reviewed by bots, and people here are asking for them to be reviewed by actual people with real brains instead, because the system is completely broken: there are no consequences for these trolls making false claims.

[-] General_Effort@lemmy.world 0 points 2 months ago

I'm not even sure what you're arguing for, since you seem to have done a complete 180 on your stance. You earlier said you don't want YouTube adjudicating the law (by choosing sides in a copyright claim), but now you're arguing that they have to do this in order to avoid liability.

I see the problem.

E.g., young people may not buy alcohol. When a cashier asks for ID, they are not adjudicating the law but following it. Right?

When you personally copy something, you must follow the law. E.g., when you re-upload some image for use on Lemmy, you must "judge" whether you can legally do so. Maybe it's fair use, but that's not as clear-cut as checking an age. When you make that call, it does not mean you are adjudicating the law.

Under US law, someone can send a DMCA notice to the server. If the server owner ignores the takedown request, then they become liable for damages for the copyright infringement. Maybe the owner decided that it was a case of fair use, but that does not mean they are adjudicating the law.

I hope that helped.


The issue here is copyright trolls claiming copyright over things that don’t belong to them.

That is criminal fraud. A copyright troll usually means someone operating on the legal side of the system.

Currently, these cases are reviewed by bots,

That is wrong. But thank you for helping me understand the problems of the people here.
