this post was submitted on 13 Feb 2026
-12 points (20.0% liked)


I know this topic, like most online topics, runs more on emotion than on facts. But here we are. You probably don’t understand how Discord's new policy is intended to help protect children, because you aren’t trained to think like a child predator. (That’s fine.) I’ve had to take a lot of child-safety training over the years for professional reasons. Here’s how online child predators work.

They start by finding a kid with a secret. Just as a lion will generally attack the weakest gazelle, child predators go after vulnerable kids.

They find a kid with a secret and say, “hey, want to see some porn?”, and of course the kid is curious. They don't start with anything bad; this is a process for them. But they tell the kid, “be sure you don’t tell your parents about this. This is our secret.” Then they slowly pull the kid into deeper and deeper secrets and start to blackmail them, demanding that the kid send nude photos. They trap kids in ever deeper secrets and guilt to get more and more out of them. In the worst cases this ends in an in-person meetup with the predator.

The easiest places for predators to start this process are porn sites that kids are already visiting in secret, and especially porn servers on Discord, where messaging between users is the main feature. Those kids are the most vulnerable.

So how is Discord's policy supposed to protect kids? The goal is to keep vulnerable kids out of the spaces where they would be targeted in the first place.

So there you go. I’m all ears for how to do this better. That's one beef I have with the EFF right now: they offer no alternative solutions. They just don't want any kind of protections at all.

[–] baronvonj@piefed.social 4 points 1 hour ago (1 children)

When Roblox rolled this out, kids were using AI tools to pass the face scan. Kids can easily sneak their parents' ID and upload it in secret. Predators can use the same tactics. So we haven't gained any security, but some people will end up having their real data exposed.

The only real solution is for the parents/guardians to be engaged and involved in their children's lives. That means I have to occasionally invade their privacy and look at their communications, but more importantly it means I have to tell my children about the predators and what kinds of things predators will say to them. It's impossible to child-proof the world around us; we have to world-proof the children themselves.

[–] 1dalm@lemmings.world 1 points 1 hour ago* (last edited 1 hour ago) (1 children)

The only real solution is for the parents/guardians to be engaged and involved in their children's lives.

I don't agree with that. It's not all on the parents. It can't be all on the parents.

This is like the Boy Scouts saying, "Hey, it's not our responsibility to protect kids. The parents should have been more involved." No: if you provide the service, then it's your responsibility to make sure that service is safe.

And yes, I believe you should be held accountable for the services you provide.

[–] baronvonj@piefed.social 4 points 1 hour ago

The problem with your comparison is that with scouting you're physically sending your kids off with other people, and there are physical limits on participation. Of course it's incumbent on the organization to make sure the people it employs are trustworthy (the same goes for the adults who volunteer to be responsible for the children), and to investigate any reports of abuse by its employees, scouts, or adult volunteers. In the same way, Discord is responsible for making sure its own employees aren't abusing and preying on users, and it's expected to investigate any such reports too.

There are absolutely things to hold digital platforms accountable for to make things safer. But face scans and uploading government ID don't accomplish that. We hold platforms to account by auditing their responses to abuse reports and any failures in their privacy and security controls; if they can't manage that, then they should be dissolved.

Even just since I posted my comment, there's a new link making my piefed front page:

https://www.404media.co/free-tool-says-it-can-bypass-discords-age-verification-check-with-a-3d-model/

Residential IPs aren't static. VPNs exist. Residential ISP proxy services exist. Cloud providers exist on every inhabited continent. Tor exists. Determined predators and bad actors will get onto the platforms and can get verified as whatever age group they want.

If we pretend otherwise instead of educating children on how to recognize predatory behavior, then we haven't protected them at all.