this post was submitted on 11 Feb 2026
71 points (98.6% liked)
Privacy
The way Bunnings used facial recognition was about as private as it could be ... mostly. It is possible to run all of the facial recognition locally, but instead they run it on a central Bunnings-controlled server in Sydney. They only scan for faces of known offenders, based on previous entanglements.
My concern isn't surveillance, but mass surveillance. Because this is exclusive to Bunnings, it doesn't quite reach mass surveillance; but because the processing is centralized and Bunnings is so big, it edges dangerously close.
Still, this isn't Flock, nor Palantir, nor Google.
How does the system know who’s an offender and who isn’t without scanning their face first?
They look at CCTV, take a violentcustomer.jpeg, and add it to the database.
EDIT: Oh, I see what you're asking. You misread my quote: what I said was "they only scan for", not "they only scan".
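For anyone curious, the watchlist-only flow described above (embed every face, but only report matches against known offenders) could be sketched roughly like this. Everything here is hypothetical: the function names, the threshold, and the hash-based `embed()` stub are stand-ins for illustration, not Bunnings' actual system, which would use a real face-embedding model.

```python
import hashlib
import math
from typing import Optional

def embed(image_bytes: bytes) -> list:
    # Stand-in for a real face-embedding model: derive a fixed-length
    # vector deterministically from the image bytes, centered so that
    # unrelated images give near-zero cosine similarity.
    digest = hashlib.sha256(image_bytes).digest()
    return [(b - 127.5) / 127.5 for b in digest]

def cosine(a: list, b: list) -> float:
    # Cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Watchlist built from CCTV stills of previous incidents,
# exactly as in the "violentcustomer.jpeg" example above.
watchlist = {"violentcustomer.jpeg": embed(b"cctv-still-of-prior-incident")}

def scan(frame: bytes, threshold: float = 0.9) -> Optional[str]:
    """Return the matching watchlist entry, or None.

    The key privacy property: faces of non-offenders are embedded
    transiently and never stored or reported.
    """
    face = embed(frame)
    for name, ref in watchlist.items():
        if cosine(face, ref) >= threshold:
            return name
    return None
```

The point of the sketch is the shape of the system, not the matching math: every shopper's face passes through `embed()`, but only watchlist hits produce any output, which is what distinguishes "they only scan for offenders" from "they only scan offenders".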