I know this topic, as well as most online topics, is more about emotions than facts. But here we are. You probably don’t understand how Discord's new policy is intended to help protect children because you aren’t trained to think like a child predator. (That’s fine.) I’ve had to take a lot of child safety trainings over the years for professional reasons. Here’s how online child predators work.
They start by finding a kid with a secret. Just like a lion generally picks off the weakest gazelle, child predators go after the vulnerable kids.
They find the kids with a secret and say "hey, want to see some porn?", and of course the kid is curious. They don't start with anything bad. This is a process for them. But they will tell the kid, "be sure you don't tell your parents about this. This is our secret." Then they slowly pull the kid into deeper and deeper secrets and start to blackmail them, demanding nude photos and leveraging the accumulated secrets and guilt to get more and more out of them. In the worst cases this ends with an in-person meetup with the predator.
The easiest places for predators to start this process are porn sites that kids are already visiting in secret, and especially porn communities on Discord, where messaging between users is the main feature. The kids in those spaces are the most vulnerable.
So how is Discord's policy supposed to protect kids? The goal is to keep the vulnerable kids out of the spaces where they would be targeted in the first place.
So there you go. I'm all ears for how to do this better. That's one beef I have with the EFF right now: they offer no alternative solutions to this. They just don't want any kind of protections at all.
I don't agree with that. It's not all on the parents. It can't be all on the parents.
This is like if the Boy Scouts said "Hey, it's not our responsibility to protect kids. The parents should have been more involved." No, if you are providing the service then it's your responsibility to make sure that service is safe.
And yes, I believe you should be held accountable for the services you provide.
The problem with your comparison is that with the Scouts you're physically sending your kids off with other people, and there are physical limits on who can participate. Of course it's incumbent on the organization employing those people to make sure they're trustworthy (same for the adults volunteering to be responsible for the children), and the organization is expected to investigate any reports of abuse involving its employees, scouts, or volunteers. In the same way, Discord is responsible for making sure its own employees aren't abusing and preying on users, and it's expected to investigate any such reports too.

There are absolutely things to hold digital platforms accountable for to make them safer. But face scans and uploading government ID don't accomplish that. We hold platforms to account by auditing their responses to abuse reports and any failures in their privacy and security controls, and if they can't manage that, then they should be dissolved.
Even just since I posted my comment, there's now this new link making my front page on piefed:
https://www.404media.co/free-tool-says-it-can-bypass-discords-age-verification-check-with-a-3d-model/
Residential IPs aren't static. VPNs exist. Residential ISP proxy services exist. Cloud providers exist on every inhabited continent. Tor exists. Determined predators and bad actors will get on the platforms and can get verified as whatever age group they want.
If we pretend otherwise instead of educating kids on how to recognize predatory behavior, then we haven't protected them at all.