this post was submitted on 11 Jun 2025
92 points (100.0% liked)

Fediverse vs Disinformation


Pointing out, debunking, and spreading awareness about state- and company-sponsored astroturfing on Lemmy and elsewhere. This includes social media manipulation, propaganda, and disinformation campaigns, among others.

Propaganda and disinformation are a big problem on the internet, and the Fediverse is no exception.

What's the difference between misinformation and disinformation? The inadvertent spread of false information is misinformation. Disinformation is the intentional spread of falsehoods.

By equipping yourself with knowledge of current disinformation campaigns by state actors, corporations and their cheerleaders, you will be better able to identify, report and (hopefully) remove content matching known disinformation campaigns.


Community rules

Same as instance rules, plus:

  1. No disinformation
  2. Posts must be relevant to the topic of astroturfing, propaganda and/or disinformation

founded 11 months ago

I think we all know by now that major social media platforms in the West are the target of multiple astroturfing and psyop campaigns by both private and state actors.

This post, while obvious in its implication, is important: it is the first time in recent memory that I have seen this fact discussed on a major site without it drawing a flood of accusations of conspiratorial thinking. There is also an important meta-discussion to be had about our role, as fediverse denizens, in combating such campaigns.

Obviously, we don't have the manpower to oppose things like this directly. There is also the unfortunate reality that we are not as immune here as we might like to think. I personally believe the fediverse is likely subject to similar astroturfing, and that believing otherwise is naive. But even if sites like Lemmy are not major targets, we are still subject to a trickle-down effect from the major social media platforms: popular opinion here will be swayed indirectly by these campaigns regardless of whether we are specifically targeted.

How can we protect our communities and more importantly our societies?

top 18 comments
[–] sharkfucker420@lemmy.ml 25 points 5 days ago (1 children)

How can we protect our communities and more importantly our societies?

Precisely with communities like this imo. Keep up the good work ❤️

[–] pelespirit@sh.itjust.works 17 points 5 days ago (2 children)

Also, know the tactics:

Once we isolate key people, we look for people we know are in their upstream -- people that they read posts from, but who themselves are less influential. (This uses the same social media graph built before.) We then either start flame wars with bots to derail the conversations that are influencing influential people (think nonsense reddit posts about conspiracies that sound like Markov chains of nonsense other people have said), or else send off specific tasks for sockpuppets (changing the wording of an idea here; causing an ideological split there; etc).

The goal is to keep opinions we don't want fragmented and from coalescing into a single voice for long enough that the memes we do want can, at which point they've gotten a head start on going viral and tend to capture a larger-than-otherwise share of media attention.

(All of the stuff above is basically the "standard" for online PR (usually farmed out to an LLC with a generic name working for the marketing firm contracted by the big firm; deniability is a word frequently said), once you're above a certain size.)

https://archive.is/PoUMo
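The graph step in the quoted playbook ("isolate key people, then find their upstream") is mechanically simple. Below is a minimal sketch in plain Python; the account names and the follower-count influence proxy are my own hypothetical assumptions, not from the quoted source. It may help defenders reason about which accounts such a campaign would target:

```python
# Directed "information flow" graph: reads[a] = accounts that a reads posts from.
# All account names are hypothetical; influence is a crude "how many accounts
# read you" proxy, standing in for the richer social graph the quote describes.
reads = {
    "influencer": ["niche_blogger", "small_account", "big_outlet"],
    "follower1": ["influencer", "big_outlet"],
    "follower2": ["influencer"],
}

# Count how many accounts read each account.
influence = {}
for reader, sources in reads.items():
    influence.setdefault(reader, 0)
    for src in sources:
        influence[src] = influence.get(src, 0) + 1

def upstream_targets(key_account):
    """Accounts the key account reads that are less influential than it --
    the 'upstream' the quoted playbook says to derail with flame wars."""
    own = influence[key_account]
    return [a for a in reads.get(key_account, []) if influence[a] < own]

print(upstream_targets("influencer"))  # → ['niche_blogger', 'small_account']
```

Note that `big_outlet` is excluded: it is as influential as the influencer itself, so disrupting it would be costly and conspicuous, whereas the smaller upstream accounts are cheap to derail.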

from Bannon:

“The opposition party is the media,” Steve Bannon, who helped run Trump’s 2016 campaign, told PBS Frontline five years ago. “And the media can only — because they’re dumb and they’re lazy — they can only focus on one thing at a time.”

So the solution, per Bannon? Overwhelm them.

“All we have to do is flood the zone,” he said. “Every day we hit them with three things. They’ll bite on one, and we’ll get all of our stuff done, bang, bang, bang. These guys will never — will never be able to recover. But we’ve got to start with muzzle velocity.”

https://www.npr.org/2025/02/07/nx-s1-5289315/trump-week-in-review

The best defense is to call them out on it and then walk away. They'll downvote the shit out of you, but who tf cares about upvotes and downvotes. If someone is getting downvoted heavily, read what they said carefully before piling on.

[–] Corgana@startrek.website 8 points 5 days ago

The best defense is to call them out on it and then walk away

Yes, exactly. I try to simply describe what they are doing ("This account is spreading the false narrative _____ for the purposes of ___") and then not reply again. They want engagement: the more back-and-forth bickering goes on, the less likely a third-party reader is to read beyond the top comment (the propaganda), and seeing a lot of replies can also give the impression that the debate is legitimate. Getting into a "debate" with someone "debating" in bad faith only helps them flood the zone with shit.

[–] Maeve@kbin.earth 0 points 5 days ago
[–] Novocirab@feddit.org 10 points 5 days ago* (last edited 5 days ago) (3 children)

Astroturfing on the fediverse will probably take a different form for the time being: since people here are more politically minded than average, with a robust tendency toward the left, the attempts will probably be aimed mostly at distracting, derailing, and sowing discord and doubt, stifling any nascent initiative rather than garnering sympathy for anything in particular. Much like what's described in this post from yesterday.

How to guard against this... It feels to me like it will help vastly if as many of us as possible are also engaged in Matrix/Element channels, i.e. either channels specific to instances or channels specific to topics (computing, politics, ecology...). Especially those who run popular communities. This would strengthen an implicit "web of trust", in that people will over time build a better impression of whom they're dealing with (after all, it's one thing to publish astroturfing posts, but a different thing to simultaneously maintain semi-personal relationships in a chatroom while never raising doubts about your earnestness). Also, whenever some of us erroneously start to mistrust each other for whatever reason, being in touch over a second channel will give us a better chance at sorting things out before a lasting rift occurs.

[–] Corgana@startrek.website 4 points 5 days ago

Reddit mods can sniff out astroturfing pretty easily actually, but Reddit inc doesn't do much to stop it. On the Fediverse, admins can simply ban from the instance, and if an instance does a poor job of removing inauthentic content then they can defederate.

[–] Kyrgizion@lemmy.world 3 points 5 days ago

I'd say that Lemmy's current userbase is highly reminiscent of early Reddit's (pre-2012 or so).

[–] Maeve@kbin.earth 1 points 5 days ago (1 children)
[–] Novocirab@feddit.org 3 points 5 days ago* (last edited 5 days ago) (1 children)

To be clear, what I proposed above doesn't give full protection against targeted false-flag campaigns (what does?). But it does raise the personnel costs required for such campaigns to succeed, and it gives us a better chance of not devouring ourselves out of false suspicion.

[–] Maeve@kbin.earth 2 points 5 days ago (1 children)

Maybe. I was in those chats and paranoia and suspicion abounded before Sabu showed up. Not that that's entirely bad, but it didn't prevent Sabu, just saying.

ETA: I personally am not really interested in participating in chats anymore. Not saying I never would, just that I need more IRL rn.

[–] Novocirab@feddit.org 2 points 5 days ago (1 children)

I'll also add that what I have in mind is discussions about politics and political strategies. If I read you right, the chats you mention were dealing with activities whose legality was at least questionable (in which case heavy paranoia among those involved would probably be inevitable).

[–] Maeve@kbin.earth 3 points 5 days ago

They were largely political. Anything "criminal" discussed in the beginning was about how to give regular people more access to information, and real solutions to RL problems. As those channels grew, ideas necessarily diversified, some more radical, some pretty vanilla.

Power criminalizes anything that may lead to a concession of that power. Something something asked nicely, etc. And as those channels grew, so too did the number of bad actors with ill intent from the jump, whether LE, political disruptors, or outright chaos goblins.

In short, nothing is risk-free, but LE is more of a threat than any other bad actor, because protests will be criminalized, mutual aid will be criminalized, reporting will be criminalized, recording, anything. And it already is, de facto if not in writ. But it serves no one to demonize everyone.

[–] Anon518@sh.itjust.works 7 points 5 days ago (1 children)
[–] Novocirab@feddit.org 5 points 5 days ago* (last edited 5 days ago)

And conversely: When feeling an itch to provide the solution to a problem on Reddit, instead re-post the problem along with your solution here on Lemmy. Post a link to it on Reddit, if the presumptive audience on Reddit is one that we'd like to draw over. (This is more about enlivening Lemmy and only indirectly helps to combat disinformation.)

[–] cubism_pitta@lemmy.world 4 points 5 days ago* (last edited 5 days ago) (1 children)

I think that's the challenge.

With federation, the obvious answer is to defederate from instances allowing such blatant propaganda.

But what about less blatant forms?

Vote manipulation is obviously against the rules (it's even supposed to be against Reddit's). If we ignore the vote manipulation, would any Fediverse rules have been broken?

Looks to me like competing communities have started up and the people that started them are REALLY concerned about riots.

Without vote manipulation they would fizzle out fast and become dead communities/subreddits.

Their only hope would be to become the Conservative version of another community; fragmenting communities is hard to do and likely wouldn't work in such a granular manner.

[–] Corgana@startrek.website 4 points 5 days ago (1 children)

"The fediverse" has no rules, if an instance wants to allow vote manipulation they have that power.

[–] cubism_pitta@lemmy.world 4 points 5 days ago (1 children)

Yeah, I think "norms" is a better word for it.

Flooding instances with posts that are blatantly weighted would likely lead to defederation.

That said, the bigger the instance the more power they have to do what they want.

Lemmy.world for instance could put the rest of the Lemmy fediverse between a rock and a hard place if they wanted to

I am sort of talking out my ass, but that's how I understand things as they are right now.

We have seen interesting things happen with decentralized systems in the past though.

In IRC land, for example, Freenode.net went through a hostile takeover, which within 24 hours caused operators and admins to jump ship and start their own Freenode (with blackjack and hookers), and that seems to have been a success story.

https://en.wikipedia.org/wiki/Freenode

[–] Corgana@startrek.website 3 points 5 days ago

Lemmy.world for instance could put the rest of the Lemmy fediverse between a rock and a hard place if they wanted to

beehaw.org is doing great, and they defederated from .world a while ago. Your point is correct, though; Mastodon.social, for example, has half of all Mastodon users.

That said, there is little incentive to run a large instance; it costs a lot more and requires a lot more work.