this post was submitted on 13 Feb 2026

I know this topic, as well as most online topics, is more about emotions than facts. But here we are. You probably don’t understand how Discord's new policy is intended to help protect children because you aren’t trained to think like a child predator. (That’s fine.) I’ve had to take a lot of child safety trainings over the years for professional reasons. Here’s how online child predators work.

They start by finding a kid with a secret. Just like a lion will generally choose to attack the weak gazelle, child predators go after vulnerable kids.

They find the kids with a secret and say, “hey, want to see some porn?”, and of course the kid is curious. The predators don’t start with anything bad; this is a process for them. But they will tell the kid, “be sure you don’t tell your parents about this. This is our secret.” Then they slowly pull the kid into deeper and deeper secrets and start to blackmail them. They demand that the kid send them nude photos. They trap the kids in deeper and deeper secrets and guilt to get more and more out of them. In the worst cases, this ends with in-person meetups with the predator.

The easiest places for predators to start this process are porn sites, which the kids are already visiting in secret, and especially porn servers on Discord, where messaging between users is the main feature. Those kids are the most vulnerable.

So how is Discord's policy supposed to protect kids? The goal is to keep the vulnerable kids out of the spaces where they would be targeted in the first place.

So there you go. I’m all ears for how to do this better. That's one beef I have with the EFF right now: they offer no alternative solutions to this. They just don't want any kind of protection at all.

[–] Skavau@piefed.social 1 points 7 hours ago (1 children)

The easiest solution is to keep everything public. Just don’t allow 1-to-1 communication at all.

I think removing private messaging would be very unpopular on here. So that's not going to happen.

[–] 1dalm@lemmings.world 0 points 7 hours ago (1 children)

Probably. So the community needs to figure out how to offer the services they want to have while also protecting children.

[–] Skavau@piefed.social 1 points 7 hours ago (1 children)

So what do you propose then exactly? Private messaging isn't going anywhere.

[–] 1dalm@lemmings.world 0 points 7 hours ago (1 children)

Well, first I would recommend server hosts that "can't afford to protect children" be much more careful who they let onto their personal network.

Second, I would recommend the developer community start treating this problem seriously and use the power of the open-source development process (which is really good at finding creative solutions to problems) to make this a development priority.

[–] Skavau@piefed.social 1 points 7 hours ago* (last edited 7 hours ago) (1 children)

I don't think the powers-that-be care about the Fediverse at all, bro. Legitimately.

It's far too small. Moreover, it is actually, in the terms you refer to, fairly well moderated.

[–] 1dalm@lemmings.world 1 points 6 hours ago (1 children)

I'm only going to say this one more time.

They 100% will care.

[–] Skavau@piefed.social 1 points 6 hours ago

Only in some unique situation where child porn is shared and the instance it's on does nothing about it.

But they don't do that.