1441
submitted 1 year ago* (last edited 1 year ago) by driving_crooner@lemmy.eco.br to c/nostupidquestions@lemmy.world

She's almost 70, spends all day watching QAnon-style videos (but in Spanish), and every day she's anguished about something new. Last week she was asking us to start digging a nuclear shelter because Russia had dropped a nuclear bomb on Ukraine. Before that she was begging us to install reinforced doors because the indigenous population was about to invade the cities and kill everyone with poisonous arrows. I have access to her YouTube account and I'm trying to unsubscribe from and report the videos, but the recommended videos keep feeding her more crazy shit.

[-] 001100010010@lemmy.dbzer0.com 185 points 1 year ago

I'm a bit disturbed by how people's beliefs are literally shaped by an algorithm. Now I'm scared to watch YouTube because I might inadvertently be watching propaganda.

[-] Mikina@programming.dev 53 points 1 year ago* (last edited 1 year ago)

My personal opinion is that it's one of the first large cases of misalignment in ML models. I'm 90% certain that Google and other platforms have for years been using ML models that take a user's history and whatever data they have about them as input, and output which videos to offer them, with the goal of maximizing the time they spend watching videos (or on Facebook, etc.).

And the models eventually found out that if you radicalize someone, isolate them into a conspiracy that makes them an outsider or a nutjob, and then provide a safe space and an echo chamber on the platform, be it Facebook or YouTube, they will eventually start spending most of their time there.

I think this subject was touched upon in The Social Dilemma documentary, but given what is happening in the world, and how conspiracies and disinformation seem to be getting more and more common and people more radicalized, I'm almost certain the algorithms are to blame.
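To make the misalignment point concrete, here's a toy sketch of my own (nothing like the real recommender; the category names and watch-time numbers are made up): an epsilon-greedy bandit rewarded only with minutes watched will converge on whatever content holds attention longest, without any notion of what that content is.

```python
import random

# Toy sketch, NOT YouTube's actual system. Assumed average minutes
# watched per recommendation, per content category (invented numbers;
# we assume radicalizing content holds attention longest).
CATEGORIES = {
    "cooking": 5.0,
    "science": 8.0,
    "conspiracy": 20.0,
}

def simulate(steps=5000, epsilon=0.1, seed=42):
    """Epsilon-greedy bandit whose only reward signal is watch time."""
    rng = random.Random(seed)
    totals = {c: 0.0 for c in CATEGORIES}  # cumulative watch time
    counts = {c: 0 for c in CATEGORIES}    # times each category was recommended
    for _ in range(steps):
        if rng.random() < epsilon or not any(counts.values()):
            # explore: recommend a random category
            choice = rng.choice(list(CATEGORIES))
        else:
            # exploit: recommend the category with the best observed mean watch time
            choice = max(CATEGORIES,
                         key=lambda c: totals[c] / counts[c] if counts[c] else 0.0)
        reward = rng.gauss(CATEGORIES[choice], 2.0)  # noisy watch time
        totals[choice] += reward
        counts[choice] += 1
    return counts

counts = simulate()
# The bandit never "knows" what conspiracy content is; it just learns
# that recommending it maximizes the watch-time reward.
print(max(counts, key=counts.get))  # → conspiracy
```

The optimizer isn't malicious; the reward function simply doesn't encode anything we actually care about besides time spent.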

[-] Ludrol@szmer.info 16 points 1 year ago

If YouTube's algorithm is optimizing for watch time, then the most optimal solution is to make people addicted to YouTube.

The scariest thing, I think, is that the optimal way to maximize the reward is not to recommend a good video but to reprogram a human to watch as much as possible.
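That's basically Goodhart's law. A tiny illustration with numbers I made up (none of this is real data): when the proxy metric (watch time) diverges from the true goal (something like user wellbeing), optimizing the proxy picks the worst video by the true goal.

```python
# Each candidate video: (watch_minutes, wellbeing_score).
# Both columns are invented for illustration.
videos = [
    ("documentary", (12, +0.8)),
    ("tutorial",    (9,  +0.6)),
    ("ragebait",    (25, -0.9)),
    ("conspiracy",  (30, -1.0)),
]

by_proxy = max(videos, key=lambda v: v[1][0])  # what a watch-time optimizer picks
by_goal  = max(videos, key=lambda v: v[1][1])  # what we'd want it to pick

print(by_proxy[0])  # → conspiracy
print(by_goal[0])   # → documentary
```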

[-] Mikina@programming.dev 7 points 1 year ago

I think that making someone addicted to YouTube would be harder than simply slowly radicalizing them into a shunned echo chamber around a conspiracy theory. If you just try to make someone addicted to YouTube, they still have an alternative in the real world, friends and family to return to.

But if you radicalize them into something that makes them seem like a nutjob, you don't have to compete with their surroundings: the only place where they feel understood is on YouTube.

[-] archomrade@midwest.social 3 points 1 year ago

100% they're using ML, and 100% it found a strategy they didn't anticipate

The scariest part of it, though, is their willingness to continue using it despite the obvious consequences.

I think misalignment is not only likely to happen (for an eventual AGI), but likely to be embraced by the entities deploying these models, because the consequences may not impact them. Misalignment is relative.

[-] MonkCanatella@sh.itjust.works 2 points 1 year ago

fuck, this is dark and almost awesome, but not in a good way. I was thinking the fascist funnel was something deliberate, but maybe these engagement algorithms have more to do with it than large shadow actors putting the funnels into place. Then there's the folks who will create any sort of content to game the algorithm, and you've got a perfect trifecta of radicalization.

[-] floofloof@lemmy.ca 6 points 1 year ago* (last edited 1 year ago)

Fascist movements and cult leaders long ago figured out the secret to engagement: keep people feeling threatened, play on their insecurities, blame others for all the problems in people's lives, use fear and hatred to cut them off from people outside the movement, make them feel like they have found a bunch of new friends, etc. Machine learning systems for optimizing engagement are dealing with the same human psychology, so they discover the same tricks to maximize engagement. Naturally, this leads to YouTube recommendations directing users towards fascist and cult content.

[-] MonkCanatella@sh.itjust.works 2 points 1 year ago

That's interesting: it's almost a coincidence that fascists and engagement algorithms converged on similar methods to suck people in.

[-] niktemadur@kbin.social 23 points 1 year ago* (last edited 1 year ago)

You watch this one thing out of curiosity, morbid curiosity, or by accident, and at the slightest poke the goddamned mindless algorithm starts throwing this shit at you.

The algorithm is "weaponized" for whoever screams the loudest, and I truly believe it started due to myopic incompetence/greed, not political malice. Which doesn't make it any better, as people don't know how to protect themselves from this bombardment, but the corporations like to pretend that ~~they~~ people can, so they wash their hands of it for as long as they are able.

Then on top of this, the algorithm has been further weaponized by even more malicious actors who have figured out how to game the system.
That's how toxic meatheads like Infowars and Joe Rogan get a huge bullhorn that reaches millions. "Huh... DMT experiences... sounds interesting", the format is entertaining... and before you know it, you are listening to anti-vax and QAnon excrement, and your mind starts to normalize the most outlandish things.

EDIT: a word, for clarity

[-] Jaywarbs@kbin.social 3 points 1 year ago

Whenever I end up watching something from a bad channel I always delete it from my watch history, in case that affects my front page too.

[-] Sludgehammer@lemmy.world 2 points 1 year ago* (last edited 1 year ago)

I do that, too.

However, I'm convinced that YouTube still has a "suggest list" bound to IP addresses. Quite often I'll have videos that other people in my household have watched suggested to me. Some of it can be explained by similar interests, but it happens suspiciously often.

[-] Drunemeton@lemmy.world 4 points 1 year ago

I can confirm the IP-based suggestions!

My hubs and I watch very different things. Him: photography equipment reviews, photography how-tos, and old, OLD movies. Me: Pathfinder 2e, quantum field theory/mechanics, and Dip Your Car.

Yet we both see videos the other recently watched in our Suggestions. There's ZERO chance, based on my watch history, that without IP-based suggestions YT is going to think I'm interested in watching a Hasselblad DX2 unboxing. Same with him getting PBS Space Time suggestions.

[-] emptyother@lemmy.world 2 points 1 year ago

Huh, I tried that. Still got recommended incel videos for months after watching a moron "discuss" the Captain Marvel movie. Eventually I went through and clicked "don't recommend this" on everything that showed up on my front page; that helped.

[-] static@kbin.social 19 points 1 year ago* (last edited 1 year ago)

My normal YT algorithm was OK, but Shorts kept trying to pull me toward the alt-right.
I had to block many channels to get a sane Shorts algorithm.

"Do not recommend channel" really helps

[-] AstralPath@lemmy.ca 6 points 1 year ago

It really does help. I've been heavily policing my YouTube feed for years, and I can easily tell when they make big changes to the algorithm because it tries to force-feed me polarizing or lowest-common-denominator content. Shorts are incredibly quick to smother me in rage bait, and if you so much as linger on one of those videos too long, you're getting a cascade of alt-right bullshit shortly after.

[-] Andreas@feddit.dk 5 points 1 year ago

Using Piped/Invidious/NewPipe/insert your preferred alternative frontend or patched client here (YouTube's legal threats are empty; these are still operational) helps even more, showing you only the content you have opted in to.

[-] nLuLukna@sh.itjust.works 15 points 1 year ago

Reason and critical thinking are all the more important in this day and age. It's just no longer taught in schools. Learn some simple key skills, like noticing fallacies and analogous reasoning, and you will find that your view of life is far more grounded and harder to shift.

[-] Dark_Arc@lemmy.world 15 points 1 year ago

I think it's worth pointing out that "no longer" is not a fair assessment, since this is regularly an issue with older Americans.

I'm inclined to believe it was never taught in schools, and it's probably a subject teachers are increasingly likely to want to teach (i.e., if politics didn't enter the classroom, it would already be taught, and it might be in some districts).

The older generations were given catered news their entire lives, only in the last few decades have they had to face a ton of potentially insidious information. The younger generations have had to grow up with it.

A good example is that old people regularly click malicious advertising, fall for scams, etc. They're generally not good at applying critical thinking to a computer, whereas younger people (typically, though I hear this is regressing some with smartphones) know about this stuff and are used to validating their information (or at least have a better "feel" for what's fishy).

[-] cynar@lemmy.world 9 points 1 year ago

Just be aware that we can ALL be manipulated, the only difference is the method. Right now, most manipulation is on a large scale. This means they focus on what works best for the masses. Unfortunately, modern advances in AI mean that automating custom manipulation is getting a lot easier. That brings us back into the firing line.

I'm personally an Aspie with a scientific background. This makes me fairly immune to a lot of manipulation tactics in widespread use. My mind doesn't react how they expect, so they don't achieve the intended result. I do know, however, that my own pressure points are likely particularly vulnerable. I've not had the practice of resisting having them pressed.

A solid grounding gives you a good reference, but no more. As individuals, it is down to us to use that reference to resist undue manipulation.

[-] MonkCanatella@sh.itjust.works 3 points 1 year ago

imagine if they taught critical media literacy in schools. of course that would only be critical media literacy with an american propaganda backdoor but still

[-] Redonkulation@lemmy.world 1 points 1 year ago

Texas basically banned critical thinking skills in the school system

[-] jerdle_lemmy@lemmy.world 14 points 1 year ago

I mean, you probably are, especially if it's explicitly political. All I can recommend is CONSTANT VIGILANCE!

[-] Atemu@lemmy.ml 13 points 1 year ago

YouTube's entire business is propaganda: Ads.

[-] 001100010010@lemmy.dbzer0.com 12 points 1 year ago

What ad? Glances at uBlock Origin

[-] martyc3@lemm.ee 2 points 1 year ago

Lately the number of ads on YouTube has increased by an order of magnitude. All they managed to accomplish was driving me away.

[-] Entropywins@kbin.social 12 points 1 year ago

I watch a lot of history, science, philosophy, stand up, jam bands and happy uplifting content... I am very much so feeding my mind lots of goodness and love it...

[-] weeahnn@lemmy.world 7 points 1 year ago

At this point, any channel that I know is either bullshit or annoying af I just block. Out of sight out of mind.

Same. I have ads blocked and open YouTube directly to my subbed channels only. Rarely open the home tab or check related videos because of the amount of click bait and bs.

[-] weeahnn@lemmy.world 4 points 1 year ago

Ohh I just use BlockTube to block channels/ videos I don't want to see.

[-] DaGuys470@kbin.social 7 points 1 year ago

Just this week I stumbled across a new YT channel that seemed to talk about some really interesting science. I almost subscribed, but something seemed fishy. Went to the channel, saw the other videos, and immediately got the hell out. Conspiracies and propaganda lurk everywhere and no one is safe. Mind you, I'm about to get my bachelor's degree next year, meaning I've received a proper scientific education. Yet I almost fell for it.

[-] masquenox@lemmy.world 5 points 1 year ago* (last edited 1 year ago)

I have to clear out my youtube recommendations about once a week... no matter how many times I take out or report all the right-wing garbage, you can bet everything that by the end of the week there will be a Jordan Peterson or PragerU video in there. How are people who aren't savvy to the right-wing's little "culture war" supposed to navigate this?

[-] shortgiraffe@lemmy.world 2 points 1 year ago
[-] masquenox@lemmy.world 1 points 1 year ago

I probably should... but I have to admit that I kinda enjoy reporting them.

Thanks - I'll certainly look into it.

[-] Thorny_Thicket@sopuli.xyz 5 points 1 year ago

I find it interesting how some people have so vastly different experience with YouTube than me. I watch a ton of videos there, literally hours every single day and basically all my recommendations are about stuff I'm interested in. I even watch occasional political videos, gun videos and police bodycam videos but it's still not trying to force any radical stuff down my throat. Not even when I click that button which asks if I want to see content outside my typical feed.

[-] livus@kbin.social 6 points 1 year ago

My youtube is usually ok but the other day I googled an art exhibition on loan from the Tate Gallery, and now youtube is trying to show me Andrew Tate.

[-] Andreas@feddit.dk 5 points 1 year ago

I watch a ton of videos there, literally hours every single day and basically all my recommendations are about stuff I'm interested in.

The algorithm's goal is to get you addicted to Youtube. It has already succeeded. For the rest of us who watch one video a day, if at all, it employs more heavy-handed strategies.

[-] Thorny_Thicket@sopuli.xyz 2 points 1 year ago

That's a good point. They don't care what I watch. They just want me to watch something.

[-] scottyjoe9@sh.itjust.works 5 points 1 year ago

At one point I watched a few videos about Marvel films and their negatives. One was about how Captain Marvel wasn't a good hero because she was basically invincible and all-powerful, etc. I started getting more and more suggestions about how bad the new strong female leads in modern films are. Then I started getting politically right-leaning shit. It starts really innocuously, and it's hard to tell it's leading you a certain way until it gets further along. It really made me think about when I'm watching content from new channels. Obviously I've blocked/purged all channels like that and my experience is fine now.

[-] bstix@feddit.dk 2 points 1 year ago* (last edited 1 year ago)

The experience is different because it's not one algorithm for everyone.

Demographics are targeted differently. If you actually get a real feed, it's only because no one has yet paid YouTube to guide you towards their product.

It would be an interesting experiment to set up two identical devices and then create different Google profiles for each just to watch the algorithm take them in different directions.

[-] abbadon420@lemm.ee 2 points 1 year ago

I don't understand how these people can endure enough ads to be lured in by QAnon. People of that generation generally don't know about decent adblockers.

this post was submitted on 08 Jul 2023
1441 points (97.5% liked)
