this post was submitted on 09 Mar 2026
73 points (86.9% liked)

Privacy


I find it alarming that to "protect" women, men have to be surveilled secretly in all public places. This is way beyond dystopian.

AI and remote security personnel get to decide if someone is "a predator" and take 'em down preemptively if they look suspicious.

What could possibly go wrong?

[–] horn_e4_beaver@discuss.tchncs.de 12 points 11 hours ago (1 children)

We’re Training Students To Write Worse To Prove They’re Not Robots, And It’s Pushing Them To Use More AI

If students have to use AI in order to make it look like they're not using AI, what on earth will a system like this do to people? How it could possibly read the intent behind people's actions without throwing up a huge number of false positives is something I don't understand.

And quite what workers are supposed to do when they receive an 'alert' of this nature, I'm not sure. Go up to the individual and tell them their behaviour has been flagged as suspicious? Way to make me feel more anxious in public.

[–] Mirshe@lemmy.world 3 points 9 hours ago

No, it already does. Facial-ID stuff already throws hundreds of false positives.