this post was submitted on 07 Mar 2026

Privacy


A community for Lemmy users interested in privacy


Nobody needs to use AI to bug our phones, or to build a sprawling nervous system to track our vitals, because our phones are already bugged. Everything we do on them is recorded a dozen times over: by our wireless carriers, by the websites we visit and the apps we use, by the vendors and ad networks those companies send their data to, and in the marketplaces that sell that data. We built the eyes of the Greco decades ago.

But that data has remained relatively secure—or maybe more precisely, its potential energy has remained relatively buried—largely because it's tedious to work with. It's messy; it's scattered across different sources and in different formats; combining it is a pain; and most of us are simply not interesting enough to investigate. Data analysts who work at shadowy government agencies have lives too, and they do not want to write 595-line SQL queries either.
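To make the tedium concrete, here is a minimal sketch of the kind of normalize-and-join grunt work the paragraph describes. Everything in it is hypothetical—the field names (`msisdn`, `seen`, `cell`), the two "sources", and the records are invented for illustration—but it shows the real friction: every source spells phone numbers and timestamps differently, so an analyst must normalize each schema before any correlation is possible.

```python
from datetime import datetime, timezone

# Two hypothetical "sources" with mismatched schemas, as scattered data tends to have:
# a carrier log using E.164 numbers and ISO 8601 timestamps...
carrier_log = [
    {"msisdn": "+15551234567", "ts": "2026-03-07T09:15:00Z", "cell": "A12"},
]
# ...and an ad-network feed using bare digits and epoch seconds.
ad_network = [
    {"phone": "15551234567", "seen": 1772875020, "host": "example-shop.com"},
]

def norm_phone(p):
    """Reduce any phone-number spelling to bare digits."""
    return "".join(ch for ch in p if ch.isdigit())

def norm_ts(value):
    """Accept epoch seconds or ISO 8601 and return an aware UTC datetime."""
    if isinstance(value, (int, float)):
        return datetime.fromtimestamp(value, tz=timezone.utc)
    return datetime.fromisoformat(value.replace("Z", "+00:00"))

def correlate(carrier, ads, window_s=600):
    """Pair carrier sightings with ad-network hits for the same phone
    within window_s seconds -- a crude cross-source join."""
    hits = []
    for c in carrier:
        for a in ads:
            if norm_phone(c["msisdn"]) != norm_phone(a["phone"]):
                continue
            gap = abs((norm_ts(c["ts"]) - norm_ts(a["seen"])).total_seconds())
            if gap <= window_s:
                hits.append((c["cell"], a["host"], gap))
    return hits
```

Two toy records and three helper functions already; multiply by dozens of real sources, each with its own quirks, and the "595-line SQL query" stops sounding like an exaggeration. The article's point is that this normalization busywork, not any analytical brilliance, is the part AI happily automates.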

But AI doesn’t mind. And that’s the boring danger of what happens next: Not of AI becoming a superintelligent Sherlock Holmes finding impossible patterns in its enormous mind palace, but of it being a million monkeys at a million typewriters, doing the grunt work no person wanted to do. Because when prying questions are a prompt away—rather than 24 hours of work away—who wouldn’t get tempted to pry?

[–] freshcow@lemmy.world 8 points 2 days ago (1 children)

The term you're looking for is "security through obscurity". The effort required to create a coherent picture from that scattered information is more than it's worth, so it doesn't get done. The argument here is that AI changes that calculation because it removes the effort.

[–] gressen@lemmy.zip 0 points 2 days ago

Security through obscurity was disproven long ago, and AI sifting through your obscure data with ease is a great example of how that approach doesn't work.