this post was submitted on 13 Feb 2026
Technology


I know this topic, like most online topics, is driven more by emotion than by facts. But here we are. You probably don’t understand how Discord's new policy is intended to help protect children, because you aren’t trained to think like a child predator. (That’s fine.) I’ve had to take a lot of child safety training over the years for professional reasons. Here’s how online child predators work.

They start by finding a kid with a secret. Just like a lion will generally choose to attack the weak gazelle, child predators go after vulnerable kids.

They find the kids with a secret and say, “hey, want to see some porn?” And of course the kid is curious. Predators don't start with anything bad. This is a process for them. But they will tell the kid, “be sure you don’t tell your parents about this. This is our secret.” Then they slowly pull the kid into deeper and deeper secrets and start to blackmail them. They demand that the kid send them nude photos. They trap the kids in deeper and deeper secrets and guilt to get more and more out of them. In the worst cases this ends with in-person meetups with the predator.

The easiest places for predators to start this process are porn sites, which the kids are already visiting in secret to begin with, and especially communities on Discord, where messaging between users is the main feature. Those kids are the most vulnerable.

So how is Discord's policy supposed to protect kids? The goal is to keep vulnerable kids out of the spaces where they would be targeted in the first place.

So there you go. I’m all ears for how to do this better. That's one beef I have with the EFF right now: they offer no alternative solutions to this. They just don't want any kind of protections at all.

[–] irate944@piefed.social 7 points 5 hours ago* (last edited 4 hours ago)

I'm firmly against what Discord is doing (and what governments like the UK and Australia, and soon others, are doing as well).

The main reason is distrust. I do not trust that they - or anyone - would use this data responsibly and only for its intended purpose.

While I do not doubt that these measures could protect more children, I also do not doubt that they will be abused. Businesses will violate whatever privacy we still have left in order to get more money from info-brokers and ad companies, and governments will use it for control. The US has been proving this with ICE, which has been using Flock to target people.

That's why I always roll my eyes whenever these kinds of measures are introduced. They're always introduced with "think of the children!" right beside them.

There's a reason why Apple, years ago, refused to develop a backdoor for iPhones when the FBI requested/ordered them to. There's just no proper way to prevent abuse of a backdoor. Yesterday they wanted to check a criminal's phone; tomorrow they may want to target an annoying journalist.

Same principle with this tracking. Once Discord (or anyone else) can tie your account to you (an IRL entity), there's nothing you can do to prevent them from abusing that knowledge. Let's assume that today they use this new system for its intended purpose; who's to say that tomorrow they still will?

Not to mention the data breach Discord suffered last year, in which around 70k proof-of-age IDs were leaked. So not only do you have to worry about Discord, you also have to worry about anyone else who may get their hands on your info.

Don't get me wrong, we NEED to improve the safety of children on the internet. I fully support doing this via education, better parental controls, maybe even banning children under a certain age from social media apps, etc.

But abusing our privacy rights is not it.