I know this topic, as well as most online topics, is more about emotions than facts. But here we are. You probably don’t understand how Discord's new policy is intended to help protect children because you aren’t trained to think like a child predator. (That’s fine.) I’ve had to take a lot of child safety trainings over the years for professional reasons. Here’s how online child predators work.
They start by finding a kid with a secret. Just like a lion will generally choose to attack the weak gazelle, child predators go after vulnerable kids.
They find a kid with a secret and say, “hey, want to see some porn?”, and of course the kid is curious. They don't start with anything bad; this is a process for them. But they will tell the kid, “be sure you don’t tell your parents about this. This is our secret.” Then they slowly pull the kid into deeper and deeper secrets and start to blackmail them, demanding that the kid send nude photos. They trap kids in deeper and deeper secrets and guilt to get more and more out of them. In the worst cases this ends in an in-person meetup with the predator.
The easiest places for predators to start this process are porn sites that kids are already visiting in secret, especially those on Discord, where messaging between users is the main feature. Those are the kids that are most vulnerable.
So how is Discord's policy supposed to protect kids? The goal is to keep the vulnerable kids out of spaces where they would be targeted to begin with.
So there you go. I’m all ears for how to do this better. That's one beef I have with the EFF right now: they offer no alternative solutions to this. They just don't want any kind of protections at all.
The main issue I personally see is that I can't trust big tech. Ever. Especially not with my biometric data or my government ID. If my government had an online service where I could verify my age and in return get some kind of official but anonymized hash/string which confirms my age to third parties, I wouldn't mind at all.
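To make the idea concrete, here is a very rough sketch of what such an anonymized age token could look like. This assumes an HMAC-based design where the government issues an opaque token (a random nonce plus a MAC, no identity in it) and third-party sites forward the token to a government verify endpoint; every name and function here is hypothetical, not a real system.

```python
# Hypothetical sketch of an anonymized government age token.
# The token contains only a random nonce and a MAC: no name, no ID number.
import hmac
import hashlib
import secrets

GOV_KEY = secrets.token_bytes(32)  # held only by the government service

def issue_age_token() -> str:
    """Citizen proves their age to the government once; gets back an
    opaque token that says nothing except 'holder is over 18'."""
    nonce = secrets.token_hex(16)
    mac = hmac.new(GOV_KEY, f"over18:{nonce}".encode(), hashlib.sha256).hexdigest()
    return f"{nonce}.{mac}"

def gov_verify(token: str) -> bool:
    """Endpoint a third-party site would call. It learns only that the
    token is valid, never who the token belongs to."""
    try:
        nonce, mac = token.split(".")
    except ValueError:
        return False
    expected = hmac.new(GOV_KEY, f"over18:{nonce}".encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(mac, expected)
```

A real deployment would need more (token expiry, rate limits, and ideally a signature scheme third parties can verify offline so the government can't see which sites you visit), but the core shape is this small.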
Instead, the whole world is hell-bent on deanonymizing everyone on the net while the political landscape in most countries leans more and more towards authoritarianism. That's a pretty shitty combination in my book.
With all that said, more safeguards for children would be great. But why not start with the inherent issues of our current online services, where profit stands above all? Most moderation on the biggest services is just algorithms, "AI", and outsourced workforces in the third world hired as sub-sub-sub-subcontractors. Sure, you can't moderate hundreds of millions of users without any automation. But cutting into the profits a bit by employing more actual people would probably help a lot already. Emphasis on probably; idk, I'm just a random asshole on the internet.
Also, it's not great to start your post by telling me I'm an emotional dumb-dumb if you want to convince me. While I don't agree with everything you said, there were a few insights new to me which I almost didn't read because you came off as an annoying know-it-all in the very first sentence.
"Main issue i personally see is that i cant trust big tech. Ever."
Me neither. And a big part of the reason why I personally don't trust them is that they advertise all these "services" to parents and kids and then only provide any sort of child protections when governments force them to.
(You want me to really get heated, get me started on youth sports!)
There are far more reasons not to trust them, but your grievances here would absolutely force all of us onto big tech, as all smaller forums and communities would be forced to shut down.
I don't believe that's necessarily correct. I think the fediverse community can figure out a way to police itself, and it does so pretty well already. One easy option: "blocking" people fast and early is generally accepted, and done at the server level. Another is that things are generally more public here, so there's less opportunity to pull people into "quiet corners" alone.
How can the fediverse police itself here?
Let us know what you propose.
The easiest is to keep everything public. Just don't allow 1-to-1 communication at all.
That would be enough to scare off most predators.
I think removing private messaging would be very unpopular on here. So that's not going to happen.
Probably. So the community needs to figure out how to offer the services they want to have while also protecting children.
So what do you propose then exactly? Private messaging isn't going anywhere.
Well, first I would recommend server hosts that "can't afford to protect children" be much more careful who they let onto their personal network.
Second, I would recommend the developer community start treating this problem seriously and use the power of the open source development process (which is really good at finding creative solutions to problems) to set this as a development priority.
I don't think the powers-that-be care about the Fediverse at all bro. Legitimately.
It's far too small. Moreover, in the terms you're referring to, it's actually fairly well moderated.
I'm only going to say this one more time.
They 100% will care the moment child porn gets shared somewhere and the instance it's on does nothing about it.
But they don't do that.