This is such a tired story already. Realistically, AI could be tremendously helpful, but it's almost always abused this way. You can't just prosecute someone because they look like the suspect. You need hard evidence.
this post was submitted on 26 Mar 2026
53 points (96.5% liked)
Videos
Exactly. It could be useful for confirming the person. Like "we found this, this, and this at the scene, and we believe it to be person X." AI can confirm some of that, like "it's an 80% match," and then they can say, good, we think we're arresting the right person.
It should NEVER EVER be used as the sole reasoning. But lazy, incompetent cops will of course use it that way, which is why it should be banned from their use. I do not trust that "a few bad apples" will avoid using it.
This sounds like it requires some work.
And she’ll probably get nothing for her misery since they’ll claim qualified immunity.