75
submitted 1 year ago by Azzu@lemm.ee to c/privacy@lemmy.ml

And if so, why exactly? It says it's end-to-end encrypted. The metadata isn't. But what is metadata and is it bad that it's not? Are there any other problematic things?

I think I have a few answers for these questions, but I was wondering if anyone else has good answers/explanations/links to share where I can inform myself more.

top 50 comments
[-] Oha@lemmy.ohaa.xyz 106 points 1 year ago* (last edited 1 year ago)

It says it's end-to-end encrypted.

Whatsapp is closed source and made by an advertising company. Wouldn't really count on that.

Edit: Formatting

[-] folkrav@lemmy.world 25 points 1 year ago* (last edited 1 year ago)

Saying they do E2EE but not actually doing it would be literal massive-scale fraud. Can't say I'd put that kind of behavior past Meta, to be fair though lol

But as the other guy said, metadata is already a lot.

[-] BitSound@lemmy.world 26 points 1 year ago

They would just say that they have a different definition of E2EE, or quietly opt you out of it and bury something in their terms of service that says you agree to that. You might even win in court, but that will be a wrist slap years later if at all.

[-] Anticorp@lemmy.ml 10 points 1 year ago* (last edited 1 year ago)

No single individual will beat a corporation as large as Facebook in a court battle. You could have all the evidence in the world and they'll still beat you in court and destroy your life in the process. It took a massive class action lawsuit to hold them accountable for the Cambridge Analytica case, and the punishment was still pennies to them.

Look at the DuPont case. There was abundant evidence that they were knowingly poisoning the planet, and giving people cancer, and they still managed to drag that case on for 30 years before a judgement. In the end they were fined less than 3% of their profit from a single year. That was their punishment for poisoning 99% of all life on planet earth, knowingly killing factory workers, bribing government agencies, lying, cheating, and just all around being evil fucks. 3% of their profit from a single year.

[-] ultratiem@lemmy.ca 19 points 1 year ago

“We just capture what you wrote and to whom before it gets encrypted and sent; we see nothing wrong with that” —Mark Zuckerberg, probably

[-] miss_brainfart@lemmy.ml 15 points 1 year ago* (last edited 1 year ago)

They don't really need the actual contents of your messages if they have the associated metadata, since it is not encrypted, and provides them with plenty of information.

So idk, I honestly don't see why I shouldn't believe them. Don't get me wrong though, I fully support the scepticism.

[-] bouh@lemmy.world 5 points 1 year ago

All they need is the encryption key for a message, and the key is not the message itself.

[-] BearOfaTime@lemm.ee 6 points 1 year ago

If the keys are held by them, they have access.

When you log into another device, if all your chat history shows up, then their servers have your encryption key.

[-] MiddledAgedGuy@beehaw.org 5 points 1 year ago

This is what I came to express as well. Unless the software is open source, both client and server, what they say is unverifiable and it's safest to assume it's false. Moreover, the owning company has a verifiable and well known history of explicitly acting against user privacy. There is no reason to trust them and every reason not to.

[-] amanneedsamaid@sopuli.xyz 36 points 1 year ago* (last edited 1 year ago)

Metadata is all the information about a message besides the actual content of the message (i.e. what you type). Examples would be the date and time it was sent, which users the message was sent to and from, and the IP addresses of both parties. (The availability of metadata varies from messenger to messenger.)

I like this example: if you only text your Aunt Sally, who lives in Alaska, twice per year to wish her a happy birthday and a merry Christmas, then just by looking at the metadata someone could infer the meaning of your messages, as well as your relationship to the person you're messaging. To a point this is true of any message you send.
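
To make that concrete, here's a rough, made-up sketch (purely for illustration; the field names are not WhatsApp's actual schema) of the kind of record a messaging server could keep about a single message even when the body itself is end-to-end encrypted:

```python
# Hypothetical illustration only - field names are made up, this is NOT
# WhatsApp's real schema. It shows the kind of record a server could keep
# about one message even though the message body itself is E2E encrypted.
message_metadata = {
    "sender": "+1-555-0100",             # who sent it
    "recipient": "+1-907-555-0199",      # who received it (Aunt Sally in Alaska)
    "timestamp": "2023-12-25T09:15:02Z", # when - e.g. every 25th of December
    "sender_ip": "203.0.113.42",         # roughly where you were
    "size_bytes": 412,                   # short text? photo? voice note?
    "client": "android",
}
# Note what's missing: the message text. Everything above is still enough to
# infer who talks to whom, when, from where, and how often.
```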

As for Whatsapp specifically, it being end-to-end encrypted doesn't really matter imo, as the application is not open source and is owned by an advertising / social media company. As long as the code is closed source, you cannot be sure:

  1. That your messages are encrypted at all
  2. That your encryption keys are kept on-device, and not plainly available to a centralized party
  3. That the encryption the application is using is securely implemented

At least for applications handling truly sensitive information (for the average person only their messenger and browser), you should be using open source software. The easiest recommendations I can make are:

  1. Browsers: Firefox, Thorium, Brave (with all the cryptocrap disabled)
  2. Messengers: Signal, SimpleX Chat, XMPP

Anyways, I hope this was a satisfactory answer.

[-] BolexForSoup@kbin.social 18 points 1 year ago

I’m iffy on Brave as a recc but otherwise this comment is fantastic. Hope OP reads it over

[-] BraveSirZaphod@kbin.social 3 points 1 year ago* (last edited 1 year ago)

That your messages are encrypted at all
That your encryption keys are kept on-device, and not plainly available to a centralized party
That the encryption the application is using is securely implemented

This is true, but something that should be noted is that, to my knowledge, no law enforcement agency has ever received the supposedly encrypted content of WhatsApp messages. Facebook Messenger messages are not E2E encrypted by default, and there have been several stories about Facebook being served a warrant for message content and providing it. This has, as I understand it, not occurred for WhatsApp messages. It is possible, of course, that they do have some kind of access and only provide it to very high-level intelligence agencies, but there's no direct evidence of that.

I would personally say that it's more likely than not that WhatsApp message content is legitimately private, but I'd also agree that you should use something like Signal if you're genuinely concerned about this.

[-] Azzu@lemm.ee 2 points 1 year ago

How do I know other browsers/messengers actually include the code that is published when they arrive on my phone? Wouldn't it be possible to simply add tracking/malicious code outside of the open-source repository, build an APK from it and put that on the Play Store instead of the "clean" code on the repository?

[-] amanneedsamaid@sopuli.xyz 5 points 1 year ago

You could compile the software yourself, and the builds they publish are reproducible, so any hidden malicious code would almost certainly be noticed in any popular application.
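
As a rough illustration of what "reproducible" buys you (a simplified sketch, not Signal's official comparison tool): an APK is just a zip archive, so you can hash every file in your own build and in the store's build and diff the results, skipping the signature files that legitimately differ.

```python
# Simplified sketch of the idea behind reproducible-build verification - not
# an official tool (Signal ships its own comparison script). An APK is a zip
# archive, so we hash every entry in a locally built APK and in the store
# APK and diff them, ignoring the signing data that differs by design.
import hashlib
import zipfile

IGNORED_PREFIX = "META-INF/"  # signature files differ between builds by design

def entry_hashes(apk_path: str) -> dict:
    """Map every file inside the APK to the SHA-256 hash of its contents."""
    hashes = {}
    with zipfile.ZipFile(apk_path) as apk:
        for name in apk.namelist():
            if name.startswith(IGNORED_PREFIX):
                continue
            hashes[name] = hashlib.sha256(apk.read(name)).hexdigest()
    return hashes

def compare(local_apk: str, store_apk: str) -> bool:
    """Print any entries whose contents differ; return True if all match."""
    local, store = entry_hashes(local_apk), entry_hashes(store_apk)
    for name in sorted(set(local) | set(store)):
        if local.get(name) != store.get(name):
            print("MISMATCH:", name)
    return local == store

# Example (file names are hypothetical):
# compare("signal-built-from-source.apk", "signal-from-play-store.apk")
```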

[-] SHITPOSTING_ACCOUNT@feddit.de 27 points 1 year ago

The biggest problem is that it uploads your entire contact list and thus social network to Facebook. That alone tells them a lot about who you are, and crucially, also leaks this information about your friends (whether they use it or not).

With contacts disabled it's a pain to use (last time I tried you couldn't add people or see names, but you could still write to people after they contacted you if you didn't mind them just showing up as a phone number).

It still collects metadata - who you text, when, from which WiFi - which reveals a lot. But if both you and your contact use it properly (backups disabled or e2e encrypted), your messaging content doesn't get leaked by default. They could ship a malicious version and if someone reports your content it gets leaked, of course, but overall, still much better than e.g. telegram which collects all of the above data AND doesn't have useful E2EE (you can enable it but few do, and the crypto is questionable).

[-] detalferous@lemm.ee 23 points 1 year ago

Is Facebook bad for privacy?

Whatsapp is Facebook. Literally. Whatsapp sold itself to Facebook.

So yes: it's bad for privacy.

[-] shiveyarbles@beehaw.org 23 points 1 year ago

It's owned by Meta, you better forget about privacy lol!

[-] netchami@sh.itjust.works 22 points 1 year ago* (last edited 1 year ago)

TL;DR: Yes it is, it's terrible. What would you expect from a Facebook product? Use Signal instead.

[-] Azzu@lemm.ee 3 points 1 year ago

Thank you, but I'm looking for actual arguments that would sway someone who is trying to come to a rational conclusion. "The reputation of the company is bad" is of course valid evidence, but it would be much more interesting to know what Facebook actually gains from having users on WhatsApp.

[-] netchami@sh.itjust.works 5 points 1 year ago

First, it is very likely that the WhatsApp encryption is compromised; it definitely shouldn't be trusted, as it is completely proprietary and thus not transparent to users and independent auditors. Also, unlike Signal, WhatsApp doesn't encrypt any metadata.

The biggest source of WhatsApp user data for Facebook, though, is address books. When you grant WhatsApp permission to access your contacts, that data is sent to Facebook servers unencrypted. That way, Facebook can see the names and phone numbers of all of your contacts. This is not just bad for you, it's also bad for everyone whose phone number you saved in your address book: their data is sent to Facebook even if they don't use any Facebook services themselves.

Also, when you have WhatsApp or any app installed on your phone, it by default has access to many things that you can't control or restrict. For example, it can access some unique device identifiers, look at the list of apps you have installed on your phone, or read sensors like the gyroscope and accelerometer, which can absolutely be used to track you.

It's better to keep shady apps like those made by Facebook, Google, Amazon, Microsoft or other surveillance corporations off your devices. Use FOSS alternatives with a proven track record, like Signal, if they are available.

[-] bouh@lemmy.world 18 points 1 year ago

It might be E2EE but it's not encrypted on your phone and it's closed source. How do you know they don't send the conversation data to their company? How do you know they don't get the encryption keys to decipher the messages for them?

[-] Anticorp@lemmy.ml 5 points 1 year ago

How do you know they don't get the encryption keys to decipher the messages for them?

My guess is that they just capture keywords before you send it. They don't need to read the contents of the sent conversation when both parties to the conversation are using an app they own. They can detect keywords before sending, log and report them, then send the message encrypted. No need to retain encryption keys since they already extracted what they want.

[-] Blizzard@lemmy.zip 17 points 1 year ago

Are you really asking about the privacy of a Facebook app?

[-] BolexForSoup@kbin.social 16 points 1 year ago

Are you going to be flippant or help educate the person?

[-] Blizzard@lemmy.zip 10 points 1 year ago

The answer is: Yes, WhatsApp is bad for privacy.

[-] majestictechie@lemmy.fosshost.com 15 points 1 year ago

While the messages themselves are encrypted, the WhatsApp app itself can still collect data from the device you're using it on:

  • Phone number
  • Operating system
  • Associated contacts, etc.

And given this is a Meta-owned company, we can probably assume they profile you from that.

[-] jjdelc@lemmy.ml 12 points 1 year ago

Your address book is uploaded to Facebook servers when you use Whatsapp. And each time you interact, they know with whom, and they link this information with other profiles and users of Meta products.

[-] ReversalHatchery@beehaw.org 11 points 1 year ago* (last edited 1 year ago)

It says it's end-to-end encrypted. The metadata isn't. But what is metadata and is it bad that it's not?

It's not just that. Their app can easily have tracking components that look at the list of installed apps, how often you charge your phone, how often you are on a WiFi network, etc.

Also, the app and any tracking component it has can communicate freely on the WiFi network. That doesn't only mean the internet, but the local home network too, where they can find out (by MAC address, open ports and the responses of the corresponding programs) what kind of devices you have, when you have them powered on, and what software you run on them (a BitTorrent client? Syncthing? KDE Connect? lots of other examples). And if, let's say, your smart TV publishes your private info on the network, it doesn't matter that you have blocked LG (just an example) domains in your local DNS server, because Facebook's apps can just relay it through your phone and then through their own servers.
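
As a toy illustration of how little that takes (a hypothetical sketch, not something any particular app is documented to do): any app with ordinary network access could probe a neighbouring device on your WiFi for open ports and guess what it is from which ones answer.

```python
# Toy sketch, for illustration only: what any app with ordinary network access
# could do on your home WiFi. The IP address and port list are made-up
# examples (8384 is Syncthing's default web UI port, 1714-1764 is the range
# KDE Connect uses, 8008/8009 are typical Chromecast ports).
import socket

def open_ports(host: str, ports: list) -> list:
    """Return the ports on `host` that accept a TCP connection."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(0.5)
            if sock.connect_ex((host, port)) == 0:  # 0 means the connection succeeded
                found.append(port)
    return found

# print(open_ports("192.168.1.50", [22, 80, 1716, 8008, 8009, 8384]))
```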

If the app's code has been obfuscated, Exodus Privacy and others won't be able to detect the tracking components in it.

[-] Azzu@lemm.ee 3 points 1 year ago

Are others different, like Signal, and how do I know?

As a normal user I install both in exactly the same way; I have no way to verify that the code of the APK on the Play Store is exactly the same as the code published by Signal as open source. How could I trust Signal more?

[-] Devjavu@lemmy.dbzer0.com 3 points 1 year ago

You can only know if you choose to read the code and compile from source. You can trust, in that you read the code and just install the app, or let others read the code for you. If reputable sources tell you it's good, most of the time it's good. How can you trust Signal more? Well, you... shouldn't. You could try to use a decompilation tool, though I don't know if that works on Android apps.

[-] ReversalHatchery@beehaw.org 2 points 1 year ago

Are others different, like Signal

Signal's encryption is sound, but there's the uncomfortable fact that it uses Google Play Services dependencies (for maps and other things, I think). There are articles (1, 2) that discuss that, because of this, it has functionality that may allow another process (the Google Play Services process) to read the Signal app's state or even directly its memory, which can mean the contents of the screen or the in-memory cache of decrypted messages.

Security audits often only audit the app's own source code, without the dependencies that it uses.
The Google Play Services dependency could have a "flaw" today, or it could grow a new "feature" one day, allowing what I described above.

It may or may not be connected that Moxie (Signal's founder) is vehemently against any kind of fork, including those that just get rid of non-free dependencies (like the Google Play Services dependencies). His other comments are also telling.

Because of these, I have decided for myself that I won't promote Signal as a better system and won't install it on my phone, because I think it gives a false sense of security, and because of other things like still requiring an identity-connected identifier (a phone number) for registration.
However, if there were people I could only reach through Signal, there's Molly. They maintain two active forks, one of which is rid of the problematic dependencies, and I would probably use that one. Molly-FOSS is not published in the official F-Droid repository, but they have their own, so the F-Droid app can still be used to install it and keep it updated.

and how do I know?

It's hard, unfortunately, and in the end you need to trust a service and the app you use for it.

F-Droid apps are auditable: they are forbidden from having non-free (non-auditable) dependencies, and popular apps available in the official repository are usually fine.

With Google Play, again, the truth is uncomfortable.

On Android, the app's signing key (a cryptographic key) makes it possible to verify that the app you are about to install has not been modified by third parties.
Several years ago, Google mandated that all app developers hand in their signing keys so that Google can sign the apps instead of them, basically impersonating them. Unfortunately this also means that unless the app's total source code is available (along with all the source code of its dependencies), it's impossible to know whether Google has made modifications to the app it serves from the Google Play Store. This in itself is already a huge trust issue to me, but what is even worse is that they can install custom modified versions for certain users on a case-by-case basis, with the same signing key that once meant the app had not been modified by third parties like Google, and no one will ever know.
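
For completeness, here's a rough sketch of what checking a signing certificate can look like (illustrative only: it reads the legacy v1/JAR signature inside the APK, while modern apps mainly rely on v2/v3 signatures stored outside the zip entries, and it assumes the third-party `cryptography` Python package).

```python
# Rough sketch (assumes the third-party `cryptography` package): print the
# SHA-256 fingerprint of the certificate in an APK's legacy v1 (JAR) signature,
# to compare against a fingerprint the developer publishes. Treat this as an
# illustration of the concept, not a complete signature check.
import zipfile
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.serialization import pkcs7

def v1_cert_fingerprints(apk_path: str) -> list:
    fingerprints = []
    with zipfile.ZipFile(apk_path) as apk:
        for name in apk.namelist():
            # v1 signature blocks live under META-INF/ as .RSA/.DSA/.EC files
            if name.startswith("META-INF/") and name.endswith((".RSA", ".DSA", ".EC")):
                for cert in pkcs7.load_der_pkcs7_certificates(apk.read(name)):
                    fingerprints.append(cert.fingerprint(hashes.SHA256()).hex())
    return fingerprints

# print(v1_cert_fingerprints("some-app.apk"))  # hypothetical file name
```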

Just an example to show that the above is possible: the Amazon app store similarly requires the developer to hand over the app's signing key, and they admit in the documentation that they add their own tracking code to every published app.

[-] BearOfaTime@lemm.ee 11 points 1 year ago

If you're on Android, the E2E is meaningless as WhatsApp can read what you type, just as the Facebook app can, since they have keyboard access.

I don't know that they do this, just saying it's a leak point, and since it's Meta/Facebook/Zuckerberg, well, let's just say I'm a bit cynical.

[-] eruchitanda@lemmy.world 10 points 1 year ago

That's what they say. ~~Meta~~ Facebook has already lied countless times before, so who knows.

[-] eruchitanda@lemmy.world 4 points 1 year ago

(You can google Facebook lawsuits. The number of results is scary.)

[-] just_another_person@lemmy.world 3 points 1 year ago

E2E is not equal to Symmetric Encryption, which is the most private "one way" encryption meaning the user controls the data at the origin, and the messages can't be decrypted by anyone else.

WhatsApp is not the latter, so it is not private. Signal is symmetric, for example.

[-] Lojcs@lemm.ee 3 points 1 year ago

Care to elaborate? You can't just imply that asymmetric encryption can be decrypted by 3rd parties and not explain how.

Also, I don't know exactly how Signal works, but I know that you don't need to share secrets externally to message someone, so how are they exchanging the symmetric keys without using asymmetric encryption to boot?
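
For what it's worth, the usual answer is Diffie-Hellman key agreement: both sides publish only public keys, and each derives the same symmetric key locally, so the key itself never travels over the network. Below is a minimal sketch with X25519; this shows the general technique only, not Signal's actual X3DH/Double Ratchet protocol.

```python
# Simplified illustration of Diffie-Hellman key agreement (X25519) using the
# `cryptography` package. This is the general technique, NOT Signal's actual
# protocol - Signal layers prekeys (X3DH) and the Double Ratchet on top of it.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates a key pair and publishes only the public half.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

# Each side combines its own private key with the other's public key...
alice_shared = alice_private.exchange(bob_private.public_key())
bob_shared = bob_private.exchange(alice_private.public_key())
assert alice_shared == bob_shared  # ...and both arrive at the same secret.

# The raw shared secret is then run through a KDF to get the symmetric key
# that actually encrypts messages. The key itself never crossed the network.
message_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"illustration"
).derive(alice_shared)
```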
