Sending the Right Signal: Why Switching to Signal Matters

Published in The Startup · Jan 13, 2021 · 5 min read

By James Smith and Fabio Tollon

The recent hype around WhatsApp’s (essentially Facebook’s) new data policy has divided people into two camps: those rushing to find an alternative messaging service that will respect their privacy, and those agreeing to WhatsApp’s terms because “it doesn’t matter, WhatsApp already knows everything about me”. Choosing Signal over WhatsApp does matter, however, and you should make the switch sooner rather than later (unless you live in the EU, where this policy is not in effect). Although Telegram collects less data than WhatsApp, it still collects far more than Signal, including IP addresses and cookie data. Telegram insists that it will only use this data to operate its services, but that phrase could mean almost anything. By the end of this article, it should be clear why, in this day and age, it is important to limit how much of our data ends up in the hands of commercial companies. The central concern here is whether, and to what extent, individual privacy ought to matter with respect to our digital lives. Encroachments on our privacy are often justified by appeal to some other value, such as security. Here, however, WhatsApp has overplayed its hand, and can offer no reasonable justification for its latest policy update.

Recent research in the field of data ethics leaves one stunned at how the data generated through online activities is used against the very users who provide it. Academic papers are filled with confirmations that it is no longer a case of “I have nothing to hide” and more a case of “look what they made me do”. Big Data describes the gathering, processing, and analysing of massive volumes of digital data produced by people on social media platforms, websites, and smart devices. Within these massive stores of information are many kinds of data: location data, likes, times of day online, health data, and income data, to name a few. Having a variety of data on a person allows for data exploration, a process in which machine learning experiments are run on a dataset in an attempt to discover novel, useful information about an individual, such as their preferences, susceptibilities, and the state of their mental and physical health. This allows companies to create a “data double”: a digital version of a person that informs companies of that user’s personal preferences, political leanings, emotional states, and so on (also known as a “psychographic profile”). In the past decade, Facebook has been known to use data doubles to make people depressed, to target advertisements at people who feel “worthless”, and generally to expand its own revenue at the expense of people’s privacy and general well-being.
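To make the idea of a data double more concrete, here is a minimal sketch in Python. The field names, signals, and inference rules are invented assumptions for the sake of illustration; they show the shape of such profiling in miniature, not any company’s actual pipeline.

```python
# Hypothetical sketch of a "data double": raw signals go in, inferred traits come out.
# Signal names and rules are invented for illustration only.
from dataclasses import dataclass, field

@dataclass
class DataDouble:
    user_id: str
    signals: dict = field(default_factory=dict)      # raw observations, grouped by source
    inferences: dict = field(default_factory=dict)   # traits derived from those observations

def ingest(double: DataDouble, source: str, value) -> None:
    """Record a raw observation (an active hour, a page like, a location ping, ...)."""
    double.signals.setdefault(source, []).append(value)

def infer_traits(double: DataDouble) -> None:
    """Toy rules standing in for the machine learning models that mine raw signals."""
    hours = double.signals.get("active_hour", [])
    if hours and sum(1 for h in hours if h >= 23 or h <= 4) / len(hours) > 0.5:
        double.inferences["late_night_user"] = True   # a crude proxy for sleep habits
    likes = double.signals.get("page_like", [])
    if likes:
        double.inferences["interests"] = sorted(set(likes))

double = DataDouble(user_id="u-123")
ingest(double, "active_hour", 1)
ingest(double, "active_hour", 23)
ingest(double, "page_like", "running")
infer_traits(double)
print(double.inferences)   # {'late_night_user': True, 'interests': ['running']}
```

The more varieties of signal that flow into a profile like this, the richer the inferences it supports, which is precisely why data variety is so prized.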

Having a variety of data is essential to conducting data exploration. The wider the range of data a company has on an individual, the more novel and useful the information revealed by that data can be. This drives Big Data companies to develop products that capture as wide a variety of data on their users as possible. WhatsApp already captures an incredibly wide variety of data on its users, such as the model of phone you have, your precise location, how you interact with people on WhatsApp, and the times of day you are most active. The updated policy makes this data available to Facebook, giving it even more material with which to create data doubles that could be used to manipulate people’s behaviour, decisions, and opinions with greater accuracy. Given the evidence of Facebook’s past indiscretions, it seems unlikely that it will use this power in the best interests of its users. For example, data gathered on WhatsApp could reveal that you support a particular political party because you are in a group with the party’s name in the title and its logo as the group’s profile picture, and because your location data shows that you attended one of its rallies. Facebook could use this information, together with the preferences and susceptibilities revealed in your data double, to subtly nudge you towards an opposition party by filtering your News Feed to include only positive posts about the opposition party and only negative posts about the party you currently support. This is not a fictional idea; online manipulations like this have been used in the past and are becoming increasingly common.
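As a rough illustration of how such feed filtering might work, here is a hypothetical sketch. The scoring rule, party labels, and post structure are all assumptions made up for this example; it is not Facebook’s ranking code, only a toy version of the mechanism described above.

```python
# Toy sentiment-weighted feed filtering: promote one party, demote the user's inferred party.
from typing import Dict, List

def rank_feed(posts: List[Dict], inferred_party: str, target_party: str) -> List[Dict]:
    """Boost positive posts about the promoted party and surface negative posts
    about the party the user is inferred to support."""
    def score(post: Dict) -> float:
        s = post["sentiment"]            # -1.0 (very negative) .. 1.0 (very positive)
        if post["party"] == target_party:
            return s                     # show the promoted party at its best
        if post["party"] == inferred_party:
            return -s                    # show the user's current party at its worst
        return 0.0
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "party": "Party A", "sentiment": 0.9},    # positive about Party A
    {"id": 2, "party": "Party A", "sentiment": -0.6},   # negative about Party A
    {"id": 3, "party": "Party B", "sentiment": 0.9},    # positive about Party B
]

# The user is inferred to support Party A; the feed quietly nudges them towards Party B.
print([p["id"] for p in rank_feed(posts, inferred_party="Party A", target_party="Party B")])
# -> [3, 2, 1]: the positive Party B post and the negative Party A post rise to the top.
```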

What emerges from the foregoing discussion is that we live in a data-intensive world, one in which our personal lives and our digital lives are no longer easily separable. It no longer makes sense to speak of technology as separate from ourselves: technology, and especially digital communications technology, often mediates our very experience of the world. Technology is no longer simply “out there”, and this is why privacy matters.

Signal does not gather nearly as much data as WhatsApp, in volume, variety, or sensitivity. The app does not store metadata such as when you are online, who you interact with, or the model of your phone. The only personal data Signal stores is your phone number, which it legitimately requires for its service to function. So minimal a dataset offers almost nothing to work with for the kind of online manipulation described above. As we enter a more data-conscious age, companies should be compelled to build privacy into the design of their products.
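The contrast comes down to data minimisation by design. The sketch below is purely illustrative, with field names assumed for the sake of the example rather than taken from Signal’s or WhatsApp’s actual schemas, but it captures the point: a privacy-first design never records the material that could later be mined, leaked, or handed to a parent company.

```python
# Illustrative contrast between a minimal account record and a data-greedy one.
# Field names are assumptions for this example, not any real service's schema.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass(frozen=True)
class MinimalAccount:
    """Privacy by design: only what is needed to route messages."""
    phone_number: str

@dataclass(frozen=True)
class DataGreedyAccount:
    """A data-greedy design: metadata that enables profiling."""
    phone_number: str
    device_model: str
    last_seen: str
    precise_location: Tuple[float, float]
    contact_graph: List[str] = field(default_factory=list)

# Anything the minimal design never stores cannot later be exploited.
account = MinimalAccount(phone_number="+27XXXXXXXXX")
print(account)
```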

Some may object that giving up some of our privacy is justified, since any privacy we surrender “pays” for itself in the form of greater security (or some other value). Such a view is misguided, however, because it treats our values as entities out there in the world, waiting to be discovered and readily knowable. This is often not how things play out in practice. It is difficult to foresee all possible outcomes, and it is often an open question which values are in fact at stake until we are confronted with some value conflict or other. An alternative account sees our values as practices. On this view, certain values should be part of the design process itself rather than external to it. This dynamic view of value allows for change and is broadly pragmatic. Importantly, it also acknowledges that we need to be vigilant with regard to the behaviour of those involved in technological innovation and development. Such an approach would force our digital overlords to treat ethical concerns (such as privacy) not as something that comes after deployment, but as essential to the development process itself. Applied to WhatsApp, it seems their latest “innovation” has been the straw that broke the camel’s back.

Replacing WhatsApp with Signal will send a warning to all data-greedy companies: when better options become available (and they will), we will all switch in the name of privacy. Such a switch also reveals a perhaps implicit change in our norms surrounding digital technologies. What is currently at stake is not “privacy as we know it”, but a future in which the very notion of “privacy” loses its force as a fundamental human right. Emerging digital technologies require that we constantly renegotiate the values we are aiming for as a society and critically engage with the specific content of those values. It is not enough to identify values (such as privacy); we must also deliberate on them, so that we can promote socially equitable outcomes.
