That headline literally says they're not doing that. It was a well-meaning initiative that they rightfully backed down on when called out.
I'm typically one of the first to assume malice or a profit motive when a company does something, but I really think Apple was trying to do something good for society in a way that was otherwise as privacy-focused as possible. They just didn't stop to consider whether they should be proactive in legal matters, and when they got reamed by privacy advocates, they decided not to go forward with it.
Good on them for canceling those plans, but they only did so because of the massive public outcry. They still intended to start scanning your photos, and that is worrying.
However, I'm not denying that it's probably still the most privacy-focused phone you can get. For now.
They wanted to scan photos stored in iCloud. Apple has an entirely legitimate interest in not storing CSAM on their servers. But instead of doing it the way every other photo service does, scanning all of your photos on the server, they designed a complex privacy-preserving method: an initial scan on device as part of the upload process, with matches only becoming visible to the server, through the magic of math, once they were confident (Apple claimed one-in-a-trillion odds of falsely flagging an account) that you were uploading literally dozens of known CSAM images. At that point a human reviewer would verify the matches to make absolutely certain, and only then would your crime be reported.
The system would do the seemingly impossible: preserve the privacy of literally everybody except the people that everyone agrees don't deserve it. If you didn't upload a bunch of CSAM, Apple itself would legitimately never learn anything about your images. The scan happened on device and the match happened in the cloud, and only if there were enough matches to guarantee confidence. It's honestly brilliant, but people freaked out after a relentless FUD campaign, including from people and organizations who absolutely should know better.
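For the curious, the core trick that makes "the server learns nothing until the threshold is crossed" possible is threshold secret sharing. Here's a minimal toy sketch in Swift of that one idea. To be clear, this is my own illustration using textbook Shamir secret sharing over a small prime field, not Apple's actual protocol (which combined NeuralHash, private set intersection, and threshold secret sharing at real cryptographic sizes); every name and number below is made up.

```swift
import Foundation

// Toy sketch of threshold secret sharing (NOT Apple's actual design).
// Each uploaded image's "safety voucher" carries one Shamir share of a
// per-account secret. The server can reconstruct that secret, and thus
// decrypt anything for review, only after collecting `threshold` shares,
// i.e. only after `threshold` images have matched known CSAM hashes.

let prime: Int64 = 2_147_483_647  // 2^31 - 1; field small enough to avoid Int64 overflow

func mod(_ x: Int64) -> Int64 { ((x % prime) + prime) % prime }

// Modular exponentiation; used for inverses via Fermat's little theorem.
func powMod(_ base: Int64, _ exp: Int64) -> Int64 {
    var result: Int64 = 1, b = mod(base), e = exp
    while e > 0 {
        if e & 1 == 1 { result = mod(result * b) }
        b = mod(b * b)
        e >>= 1
    }
    return result
}

func inverse(_ x: Int64) -> Int64 { powMod(x, prime - 2) }

// Split `secret` into `count` shares; any `threshold` of them reconstruct it.
func makeShares(secret: Int64, threshold: Int, count: Int) -> [(x: Int64, y: Int64)] {
    // Random polynomial of degree threshold-1 with constant term = secret.
    var coeffs: [Int64] = [mod(secret)]
    for _ in 1..<threshold { coeffs.append(Int64.random(in: 1..<prime)) }
    return (1...count).map { i in
        let x = Int64(i)
        var y: Int64 = 0, xPow: Int64 = 1
        for c in coeffs {
            y = mod(y + c * xPow)
            xPow = mod(xPow * x)
        }
        return (x, y)
    }
}

// Lagrange interpolation at x = 0 recovers the secret from any `threshold` shares.
func reconstruct(_ shares: [(x: Int64, y: Int64)]) -> Int64 {
    var secret: Int64 = 0
    for (i, si) in shares.enumerated() {
        var num: Int64 = 1, den: Int64 = 1
        for (j, sj) in shares.enumerated() where i != j {
            num = mod(num * mod(-sj.x))
            den = mod(den * mod(si.x - sj.x))
        }
        secret = mod(secret + si.y * mod(num * inverse(den)))
    }
    return secret
}

// Demo: the account secret stays hidden until 30 vouchers have matched.
let accountKey: Int64 = 123_456_789
let vouchers = makeShares(secret: accountKey, threshold: 30, count: 100)
print(reconstruct(Array(vouchers.prefix(29))) == accountKey)  // false: 29 shares yield garbage (w.h.p.)
print(reconstruct(Array(vouchers.prefix(30))) == accountKey)  // true: threshold reached
```

The point is that 29 matching vouchers reveal mathematically nothing; the 30th flips the switch, and only then can the server decrypt anything for human review.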
Apple proposes change
Users vote against it
Apple doesn’t do change
Nothing to see here folks
I don't quite see it like that myself. If you want to portray yourself as a privacy-focused company, why would you even suggest such a feature? Even if their intentions were purely to protect children, with zero malicious future plans, they still knew it was going to have bad optics and be widely controversial.
How would they know that? It's often hard to predict how users will react, and sometimes your expectations are wrong.
Well, shit. For once the voice of the people worked and you're still bitching about it.
You're right. Maybe I'm being a bit too harsh and should give them some credit. After all, they reversed course on those shitty butterfly switches on the MacBook keyboard too, and brought back HDMI and the SD card slot. Also ditched that stupid Touch Bar.