this post was submitted on 07 Mar 2026
48 points (94.4% liked)

Privacy


Nobody needs to use AI to bug our phones, or to build a sprawling nervous system to track our vitals, because our phones are already bugged. Everything we do on them is recorded a dozen times over: by our wireless carriers, by the websites we visit and the apps we use, by the vendors and ad networks those companies send their data to, and by the marketplaces that sell that data. We built the eyes of the Greco decades ago.

But that data has remained relatively secure—or, more precisely, its potential energy has remained relatively buried—largely because it’s tedious to work with. It’s messy; it’s scattered across different sources in different formats; combining it is a pain; and most of us are simply not interesting enough to investigate. Data analysts who work at shadowy government agencies have lives too, and they do not want to write 595-line SQL queries either.

But AI doesn’t mind. And that’s the boring danger of what happens next: not AI becoming a superintelligent Sherlock Holmes finding impossible patterns in its enormous mind palace, but AI being a million monkeys at a million typewriters, doing the grunt work no person wanted to do. Because when prying questions are a prompt away—rather than 24 hours of work away—who wouldn’t be tempted to pry?

top 6 comments
[–] AA5B@lemmy.world 13 points 2 days ago (1 children)

This was likely the case before AI as well. Collect the data, aggregate the data, we’ll find uses for it later.

I actually had this conversation with a startup company in the 2000s. Their user profile forms were a mess, so they were looking for help to fix them and secure the data. But the root cause was that they were collecting a ton of unnecessary data with no validation, verifiability, or constraints.

Me: why are you collecting all this data?

Them: we might need it later

Me: so you don’t have a use now, and you’re not making any effort to make the data clean enough to be useful. The best fix is to just stop collecting most of this.

Them: no

[–] kungen@feddit.nu 2 points 1 day ago

That's also why the NSA and similar agencies store so much encrypted data. They most likely don't have the power to break it yet, but they will, or they'll exploit flaws when they're found.

[–] gressen@lemmy.zip -3 points 2 days ago (2 children)

How can you argue that trivial data somehow implies private data?

[–] freshcow@lemmy.world 8 points 2 days ago (1 children)

The term you're looking for is "security through obscurity". The effort required to create a coherent picture from that scattered information is more than it's worth, so it doesn't get done. The argument here is that AI changes that calculation because it removes the effort part.

[–] gressen@lemmy.zip 0 points 2 days ago

Security through obscurity was discredited long ago, and AI sifting through your obscure data with ease is a great example of how this approach doesn't work.

[–] Hond@piefed.social 6 points 2 days ago* (last edited 2 days ago)

Trivial-looking data can be very relevant and private. E.g. I saw a talk on how just scraping the names of authors and the times when they publish articles on a news site can be used to draw likely conclusions about who is taking vacations "together". That's just two parameters out of probably dozens or even hundreds of the data points we leave behind by using tech.
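
To make the point concrete, here's a minimal sketch of the kind of correlation that talk describes. All names, dates, and thresholds below are made up for illustration: given each author's publication dates, it flags pairs of authors whose no-publication gaps overlap for several days, which is the "vacationing together" signal.

```python
from datetime import date, timedelta
from itertools import combinations

def absence_days(post_dates, start, end):
    """Days in [start, end] on which an author published nothing."""
    active = set(post_dates)
    out, d = set(), start
    while d <= end:
        if d not in active:
            out.add(d)
        d += timedelta(days=1)
    return out

def shared_absences(posts_by_author, start, end, min_overlap=5):
    """Author pairs whose no-post days overlap on at least min_overlap days."""
    absences = {a: absence_days(ds, start, end)
                for a, ds in posts_by_author.items()}
    pairs = []
    for a, b in combinations(sorted(absences), 2):
        overlap = absences[a] & absences[b]
        if len(overlap) >= min_overlap:
            pairs.append((a, b, len(overlap)))
    return pairs

# Fabricated byline data: alice and bob both stop publishing on June 10.
posts = {
    "alice": [date(2025, 6, d) for d in range(1, 10)],
    "bob":   [date(2025, 6, d) for d in range(1, 10)],
    "carol": [date(2025, 6, d) for d in range(1, 28)],
}
start, end = date(2025, 6, 1), date(2025, 6, 30)
print(shared_absences(posts, start, end))  # → [('alice', 'bob', 21)]
```

That's a few dozen lines against publicly visible bylines, with no private data touched at any point, and it already produces a guess about two people's personal lives. The real talk presumably used more parameters and better statistics; the sketch just shows how low the bar is.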