Why are you still leaking all of your private data when it’s so easy not to? Stop using Facebook Messenger, use Signal.

What makes messaging apps secure, and whether Facebook, WhatsApp, Telegram & Signal provide these features.

Laura Pedroni

--

This article tries to go past sensational keywords to explain, in simple terms, why privacy matters, which features contribute to making a messaging app secure, what exactly these features are and how they contribute to an app's security, and whether or not Facebook Messenger, WhatsApp, Telegram and Signal implement them.

Keeping ownership of our data, and keeping it private, is essential. Yet most of us have agreed to trade our privacy for the services tech companies provide.

We feel compelled to use their platforms because we believe there are no alternatives, or we think that other solutions are too complicated or clunky to set up and use. And when we have a hint that they might exist, we don't have the time or the expertise to assess them.

But a few services are just as useful as the ones we use every day, with features and user experiences that rival them, while safeguarding our privacy.

Our personal conversations are one of the richest sources of personal data, so in what follows we're going to focus on messaging apps that could replace Facebook Messenger, which leaks data everywhere. I'll discuss the different options and explain why I personally think Signal is the best replacement.

If you want to know:

> why your privacy matters, keep reading.

> which features contribute to an app's security, and whether or not Facebook Messenger, WhatsApp, Telegram and Signal implement them, jump to 'What makes an app secure, and what does it really mean?'

> why I think that, currently, Signal is the most secure user-friendly messaging app, go to 'Conclusion: which messaging app to use?'

> why now is a good time to switch to a privacy-focused messaging app, jump to 'Why switch now?'

Why data ownership and privacy matter

Keeping ownership of our data, and keeping it private, is essential. The subject has been discussed in depth by a wealth of activists and journalists, such as Glenn Greenwald in this TED talk. We're not going to reiterate these points in detail here, but here is a summary of some of the reasons why privacy matters:

  • We instinctively change our behaviour when we know we are being observed, restricting ourselves to what we believe is socially acceptable: we self-censor to avoid the negative consequences of breaking social or legal norms. One study shows, for instance, that after Snowden's revelations about mass surveillance, Americans' searches for certain keywords on Wikipedia dropped, correlating with an increased fear of being monitored by the NSA.
  • Privacy is the sphere where behaviours, movements and ideas that are not yet widely accepted, either by society or by the individuals themselves, are tried out and played with before being discarded, or before being pursued in the open because they are deemed important enough to be broadcast more widely. Privacy is what precedes innovation, creativity, and civil rights.
  • The lack of privacy, combined with the indefinite storage of information by external entities we don't control, denies us the right to be forgotten and immortalises our past personas forever.
  • Even though we might not always be aware of it, the technology to profile and recognise us is already there. For instance, with a single photo of a person, the Russian app FindFace could — in 2016 — find their social network profile with 70% accuracy, and its B2B page is clear about its applications for security, governments, advertising, business, etc.
  • Even data that doesn't seem worth keeping private might be seen as valuable by those who want to monitor you. And if not today, their attitude might change in the future. (For example, a recent investigation by Amnesty International suggests that the Met Police, in response to the 2011 London riots, created a database called the 'Gangs Matrix' that profiles UK residents, some based on the music they listen to and the videos they watch online.)
  • Even if we don't feel directly threatened by monitoring, the people we connect with might be, and the collected records of our interactions might be used against them.
  • Activists who campaign for citizens' rights might be threatened by government monitoring. By collectively protecting our data, we help them hide their activities, because governments can't single out individuals who protect themselves as threats 'by default'.
  • The "don't be evil" motto and similar concepts adopted by many tech companies are just that: mottos, backed by nothing more than words (Google, its main proponent, has actually just removed the phrase from its code of conduct). What companies, individuals and decision makers actually decide to do with our data, now or in the future, might very well contradict it. Furthermore, it presumes that we all agree on what 'being good' and 'being evil' mean.
  • Technology progresses fast, and the data we share today might be used tomorrow in ways we cannot yet predict. Future us might thank us for protecting it now. (Looking at AI alone, an obvious field when talking about exploiting large datasets, we are already capable of making the dead speak like the living, and of having robots book haircut appointments.) We shouldn't be scared of technology in itself, but neither should we implicitly trust that the way it will be used in the future won't affect us negatively.

What makes an app secure, and what does it really mean?

To be secure, a messaging app needs to combine a body of security features; lacking any one of them can dramatically undermine its security claims. Below, we discuss several major features that contribute to an app's security, and whether or not the major apps in the field (Facebook Messenger, WhatsApp, Telegram and Signal) implement them.


End-to-End encryption (E2EE)

End-to-end encryption is a method of communication in which only the communicating parties can read the messages sent and received. Third parties such as hackers, governments, internet providers, other people on the same Wi-Fi, or the companies providing the messaging service cannot read the messages, because only the communicating parties hold the cryptographic keys needed to decipher them. (Read a more detailed explanation here.)
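To make the idea concrete, here is a minimal sketch using the PyNaCl library (my choice for illustration; this is not the Signal Protocol, which adds forward secrecy, authentication and key rotation on top). The point it illustrates is that the server in the middle only ever relays ciphertext it cannot read.

```python
# Minimal end-to-end encryption sketch with PyNaCl (pip install pynacl).
# Illustrative only: real messengers layer much more on top of this.
from nacl.public import PrivateKey, Box

# Each party generates a key pair on their own device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Only the *public* halves ever travel through the messaging server.
alice_public = alice_private.public_key
bob_public = bob_private.public_key

# Alice encrypts with her private key and Bob's public key.
sending_box = Box(alice_private, bob_public)
ciphertext = sending_box.encrypt(b"Meet at the usual place at 8.")

# The server only sees `ciphertext`, which is useless without a private key.

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_private, alice_public)
print(receiving_box.decrypt(ciphertext))  # b'Meet at the usual place at 8.'
```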

> Who has implemented full end-to-end encryption?

  • Signal and WhatsApp. Both use the Signal Protocol developed by Open Whisper Systems, the organisation that builds Signal. End-to-end encryption is applied by default to all of their chats.

> Who hasn’t implemented full end-to-end encryption?

  • Facebook. By default, Facebook messages are not end-to-end encrypted, so Facebook and whoever accesses its servers can read your messages. In the Messenger app you can use the 'Go to Secret Conversation' feature, which is supposed to create an end-to-end encrypted chat. Have you ever seen any of your contacts use it? I haven't, and when I initiated one, most people quickly switched back to the traditional insecure chat. The feature also only works in the mobile Messenger app, and some people never received my messages when I used it.
  • Telegram. Just like Facebook, Telegram doesn't provide end-to-end encryption for its chats by default. Only special one-to-one conversations started with the 'New Secret Chat' button are end-to-end encrypted. This is a big problem, because Telegram is very vocal about its security features, and most users are unaware that they need to take extra steps to protect themselves with end-to-end encryption. They might also believe that they are fully secure when using group chats, which they are not, since Secret Chats are one-to-one only.
Only conversations started as a 'New Secret Chat' are end-to-end encrypted on Telegram.
  • To secure its non-end-to-end-encrypted chats, Telegram explains that messages stored on its servers are encrypted, and that the encryption keys are kept on servers located in different parts of the world. This means that a single government cannot gain access via legal means, as it would only be handed one of the several keys needed to decipher the messages. This is already much better than Facebook's total lack of barriers in case of a subpoena. However, it relies on trusting Telegram's intentions, on assuming it cannot be pressured into giving the keys away via extra-legal means, and on assuming that no one will succeed in attacking its servers and taking control of them.
  • Additionally, you can only see a Secret Chat on the device you started writing or reading it from. If you started a Secret Chat on your desktop, you'll only see it when connecting to Telegram on your desktop, and your contact will likewise only see it on the device they first read it on. Neither of you is notified of this limitation, so you might send messages assuming your contact can read them because they have their phone with them, when in fact they started the conversation on their desktop and can't see any of your messages while on the go. This also means that most users who use both the desktop and mobile apps quickly revert to non-end-to-end-encrypted chats, because they want one continuous conversation rather than several chats split across devices.

No metadata retention

Although the content of our messages might be encrypted, each message is surrounded by metadata, which gives away extra information about us and our activities. For maximum privacy, apps must therefore limit the amount of metadata they record.
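To give a feel for what 'metadata' covers, here is a purely hypothetical illustration of the kind of envelope a server could log around a single message; none of these field names come from any real provider, they are invented for the example.

```python
# Hypothetical metadata envelope for one message. The body is encrypted,
# but the envelope alone maps out who talks to whom, when, and from where.
message_metadata = {
    "sender_id": "+447700900123",         # who is talking...
    "recipient_id": "+447700900456",      # ...to whom
    "timestamp": "2018-06-12T23:41:07Z",  # when (late-night chats tell a story)
    "sender_ip": "203.0.113.42",          # roughly where the sender is
    "device": "iPhone 8, iOS 11.4",       # what hardware and OS they use
    "message_size_bytes": 2048,           # a photo? a short text?
    "conversation_id": "family-group-7",  # which social circle is involved
}

# Collected over months, these records reveal routines, relationships and
# locations without a single message ever being decrypted.
print(message_metadata["timestamp"])
```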

> Who limits the recorded metadata?

  • Signal. The only metadata Signal retains is the date an account was created and the date it last connected to the service; that was all Signal could hand over when it received a government data request (more on this below).

> Who isn’t clear about the metadata they store?

  • Telegram. Despite a lot of research, I haven't been able to find out what metadata Telegram collects. Its privacy notice indicates that its lawyers are still reworking the text to comply with GDPR, and it is not yet possible to download your user data to check, so hopefully we'll have more details once that happens.

> Who records as much metadata as possible?

  • Facebook Messenger and WhatsApp record huge amounts of metadata. WhatsApp's privacy policy includes, among other things: your profile picture and 'about' information, the group conversations you're part of, your choice of settings, the time, frequency and duration of your interactions, whether you are online, your IP address and other device- and network-specific information, and your location. This information is cross-referenced with the information held by other Facebook services and by companies using Facebook services.

No insecure backups

It's great to have secure, end-to-end encrypted chats, but if the content of those chats is backed up somewhere unsafe, or unencrypted, then you might as well not have encrypted anything in the first place.
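For contrast, here is a minimal sketch of what client-side backup encryption looks like, again using the PyNaCl library (my choice for illustration; the passphrase-based key derivation and file contents below are invented, not any app's real backup format). The point is that the file is encrypted on the device, before any cloud provider ever sees it.

```python
# Client-side backup encryption sketch with PyNaCl (pip install pynacl).
# Illustrative only; not the backup format of any real messenger.
import nacl.pwhash
import nacl.secret
import nacl.utils

passphrase = b"correct horse battery staple"  # chosen and kept by the user
salt = nacl.utils.random(nacl.pwhash.argon2i.SALTBYTES)

# Derive a symmetric key from the passphrase, on the device.
key = nacl.pwhash.argon2i.kdf(
    nacl.secret.SecretBox.KEY_SIZE,
    passphrase,
    salt,
    opslimit=nacl.pwhash.argon2i.OPSLIMIT_MODERATE,
    memlimit=nacl.pwhash.argon2i.MEMLIMIT_MODERATE,
)
box = nacl.secret.SecretBox(key)

chat_history = b"...exported conversation data..."
encrypted_backup = box.encrypt(chat_history)

# Only `encrypted_backup` (and the salt) ever leaves the device; whoever
# stores it cannot read it without the passphrase.
assert nacl.secret.SecretBox(key).decrypt(encrypted_backup) == chat_history
```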

> Who encrypts their backups?

- Signal. Users can create an encrypted backup on their phone, which they update at will by tapping a button under 'Chats and media'. To restore the backup or transfer it to another device, they need the passcode generated when the backup was created, and access to the folder on their phone (or wherever they have stored it).

- WhatsApp, but ONLY for iPhone users who back up to iCloud. The files are encrypted before reaching iCloud's servers, making them unreadable by Apple or anyone else with access to those servers.

> Who half-encrypts their backups ?

- Telegram. Telegram's Secret Chats are not backed up, while default chats and group chats are backed up on its servers in encrypted form. However, just as with the rest of its encryption, Telegram holds the keys, so you have to trust it, and trust that it can't be coerced into giving those keys away.

> Who doesn’t encrypt their backups?

- Facebook Messenger. No encryption there.

- WhatsApp. WhatsApp users on Android can back up their chats to Google Drive, but these files are not encrypted before being sent to Google Drive. This is a major security threat: even if you choose not to back up your own files to Google Drive, the contact you're talking to might do so. Anyone gaining access to their Google Drive would then be able to read your conversation in the clear.

WhatsApp backups for Android are stored unencrypted in Google Drive.

No knowledge of your contact list

In order to show you which of your contacts use an app, that app must compare your contact list with its list of registered users. Depending on the method used, the app might get hold of your contact list (also called your 'social graph') and store it on its servers.

> Who doesn’t know who is part of your contact list?

- Signal. Your phone's contact list isn't stored on Signal's servers. Instead, the phone numbers in your address book are obfuscated and compared with the obfuscated phone numbers registered on Signal's servers, to work out which of your contacts also use Signal; only the result is communicated to you. A toy version of the idea is sketched below.
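The sketch below uses a plain SHA-256 hash as the 'obfuscation' step, which is a simplification on my part: hashed phone numbers can be brute-forced because the number space is small, which is why Signal has kept evolving its contact discovery. It only illustrates why the server doesn't need your raw address book.

```python
# Toy hash-based contact discovery. Not Signal's actual protocol.
import hashlib

def obfuscate(phone_number: str) -> str:
    """Hash a phone number so the raw number never has to be uploaded."""
    return hashlib.sha256(phone_number.encode()).hexdigest()

# Hashes of numbers registered with the service (held server-side).
registered_hashes = {obfuscate(n) for n in ["+14155550101", "+447700900789"]}

# The client hashes its own address book locally and compares hashes only.
my_contacts = ["+14155550101", "+33612345678"]
matches = [n for n in my_contacts if obfuscate(n) in registered_hashes]

print(matches)  # ['+14155550101'], the one contact who also uses the app
```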

> Who knows who’s part of your contact list?

- Facebook Messenger, WhatsApp, Telegram.

Open Source

An open-source project is a project whose code is publicly available online for anyone to read. This allows other developers to look for potential backdoors or shortcomings in an app's encryption scheme.

Many articles brandish the term 'open source' as if it were proof in itself that the software they describe has no backdoor. It isn't. Open source isn't synonymous with untampered-with. To earn that claim, an open-source project would also need a reproducible, verifiable and verified build. Here's why. Sadly, at the moment, depending on the programming language used and the additional libraries added to a project, it's not always possible to follow these three principles. For instance, Signal has a reproducible build for its Android version, but couldn't extend it to the external libraries it uses.
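In practice, 'verifying a reproducible build' boils down to building the app from the published source yourself and checking that the result is bit-for-bit identical to what is shipped to users. A minimal sketch of that comparison follows; the file paths and the published hash are placeholders, not real Signal artifacts.

```python
# Reproducible-build check: hash your own build and compare it with the
# hash of the binary distributed to users. Paths and hashes are placeholders.
import hashlib

def sha256_of(path: str) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

local_hash = sha256_of("my-local-build/app-release.apk")   # built from source
published_hash = "replace-with-the-hash-of-the-shipped-apk"

if local_hash == published_hash:
    print("The distributed binary matches the public source code.")
else:
    print("Mismatch: the shipped binary was not built from this exact source.")
```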

> Who is open source?

- Signal. You can find all of their code here.

> Who isn’t open source?

- Facebook Messenger, WhatsApp.

- Telegram. Although a large part of its code is open source (all of the 'client side', i.e. the user-facing part), the server-side code isn't, which prevents anyone from really knowing what happens on its servers.

Additional actions to get round censorship and governments’ requests for information

While an app can be very secure, if it is subject to censorship it might be partially or completely unavailable to some of its users. And if it doesn't want to be banned, it might decide to comply with governments' requests.

> Who is publicly recognised for actively engaging in avoiding censorship / governments’ demands?

- Signal and Telegram. Both apps implemented domain fronting, a technique that hides which platform a device is trying to access, preventing censorship of that request (until both Google and Amazon disabled the feature on their clouds). See the conceptual sketch at the end of this section.

- Last April, Telegram refused to hand over its server’s keys to the Russian government, resulting in a massive wave of censorship in an effort to block the app.

- Signal worked with the American Civil Liberties Union to publicise a data request made by the government to see the data they possessed on two users.

> Who isn’t publicly recognised for opposing censorship / governments?

- Facebook Messenger, WhatsApp.

- Telegram. Edward Snowden criticised Telegram for shutting down a public channel in Iran after the telecommunications minister requested it.
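For the curious, here is a conceptual sketch of how domain fronting worked, using Python's requests library. The domains are placeholders, and, as noted above, Google and Amazon have since blocked the technique on their infrastructure, so treat this purely as an illustration of the idea.

```python
# Conceptual domain-fronting sketch (pip install requests). Placeholder
# domains; this no longer works on Google's or Amazon's clouds.
import requests

front_domain = "https://allowed-cdn.example.com"  # what a network observer sees
hidden_host = "blocked-messenger.example.org"     # the service actually requested

response = requests.get(
    front_domain,
    headers={"Host": hidden_host},  # the CDN routes the request to the real backend
)
print(response.status_code)
```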

Conclusion: which messaging app to use?

Comparing different encrypted services quickly becomes complicated and technical, so the description above only covers high-level concepts and major security points. It also only covers them as things stand now; technology and legal battles evolve fast, and what is true at the time of writing might not be tomorrow.

I would highly recommend that you do your own research (and check that the information you find is recent), or, if you are in a situation that specifically requires reinforced secrecy, that you talk to an expert. If you're in a large city, chances are there might even be a CryptoParty happening around the corner, where volunteers will hand you a drink and the tools to protect your digital self.

If you take one thing away from this article, hopefully it is that you shouldn't use Facebook Messenger. It has no end-to-end encryption by default, and it shares its data with governments, third-party companies, and anyone, really, who is ready to pay or could threaten its business model if it didn't comply.

Should you use WhatsApp instead? The question is rather: do you trust your friends to care about your data privacy? If you do, and are absolutely sure that they won't back up their data to Google Drive, then WhatsApp isn't a bad solution, as it provides end-to-end encryption. But if you're unsure, which you probably should be, then WhatsApp doesn't actually protect your data.

Now, what about Telegram? If you use Secret Chats, then Telegram can be a good choice. Otherwise, Telegram holds the keys to all of your conversations, and you have to trust it never to give them away.

So, what about Signal? Signal provides true end-to-end encryption for all of its conversations, doesn't know your phone's contact list, and keeps only the minimum metadata it needs to function. It has been praised by recognised names in the privacy and cryptography world such as Edward Snowden, Laura Poitras and Matthew Green. Give it a try. It only takes a few minutes to set up, it works on your phone and your desktop, and you'll be able to message all of your contacts who use the platform and have shared their phone number with you.

For now, I'd recommend using Signal. But keep in mind that technology changes, and new apps with good security features might also emerge, so keep an eye out!

Signal conversation. Borrowed from signal.org

Tip for Signal: if you want to be extra secure, use the 'verify safety number' option found in the settings of each conversation, which allows both of you to compare numbers or QR codes to verify that you are actually talking to each other and not to an intermediary. Connect with each other via an alternative means (in person, by phone, or through another messaging app) and check that the number associated with your conversation is the same on both sides. If it changes in the future, it either means that your contact has reinstalled Signal, switched devices or changed their phone number, or that someone else is intercepting their communication and talking to you.
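To demystify what that number is, here is a toy sketch: a short, human-comparable fingerprint derived from both parties' public identity keys. The derivation below is invented for illustration and is not how Signal actually computes safety numbers, but the principle is the same: if either identity key changes, the number changes, and both people can notice.

```python
# Toy "safety number": a comparable fingerprint of two identity keys.
# Not Signal's real derivation; illustration only.
import hashlib

def safety_number(key_a: bytes, key_b: bytes) -> str:
    # Sort so both parties compute the same number regardless of order.
    material = b"".join(sorted([key_a, key_b]))
    digest = hashlib.sha256(material).hexdigest()
    digits = str(int(digest, 16))[:30]          # keep 30 digits
    return " ".join(digits[i:i + 5] for i in range(0, 30, 5))

alice_identity_key = b"alice-public-identity-key"  # placeholder key material
bob_identity_key = b"bob-public-identity-key"      # placeholder key material

# Both phones display the same number; compare it in person or over a call.
print(safety_number(alice_identity_key, bob_identity_key))
```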

Why switch now?

Today’s public opinion is creating tomorrow’s technological landscape

In the wake of scandals like the Cambridge Analytica files, and of data protection changes such as the widely publicised GDPR, the new EU data protection regulation, more and more people are aware of the damage a lack of true data privacy can cause.

This is the perfect ground on which to initiate change, and one we need to build on in order to make data privacy and personal data ownership a must-have in the tech products of the future. If those who build tomorrow's apps see a business opportunity in people desperate for tools that safeguard their privacy, then chances are they are going to build them.

We must initiate the changes we want to see happen

An argument I hear a lot is: 'many of my contacts don't use secure platforms, therefore I cannot switch to one'.

When we talk about social networks and messaging apps, this is, after all, a very valid argument. What good does it do me to move to a highly secure conversation platform if I am there by myself?

But let’s consider WhatsApp for a minute. When I first started using it 5 or 6 years ago, just a handful of people had it. Today, the platform is used by more than 1.5 billion people.

It might sound cliché to say it, but movements start with individuals.

So go on. Download Signal, and send a link to your friends. Let’s see where we’ll be tomorrow.

The opinions expressed in this article are my own and I am not affiliated with any of the organisations mentioned.
