Is It Paranoid to Use Encrypted Messaging?

Omer Akgul
Sparks of Innovation: Stories from the HCIL
May 27, 2020 · 4 min read

Understanding current security practices and challenges in instant messaging.


Instant messaging or chat apps — like WhatsApp, iMessage, WeChat, Facebook Messenger, Telegram, and many others — are a convenient and popular way for people around the world to keep in touch. But users may not always understand whether communications sent via these apps are secure and private.

In recent years, privacy in messaging apps has become something of a hot topic. Some chat apps, such as Signal and Threema, are designed and marketed specifically for secure communications. Mainstream messaging apps have followed suit by adding on-by-default security (WhatsApp and iMessage) or providing opt-in settings for security (Telegram, Facebook Messenger, and Skype). All of these apps use or provide the option for end-to-end encryption (E2EE), a technical measure designed to ensure that messages are not readable by anyone who intercepts them mid-communication, including the app company that transmits the message. E2EE, while not perfect, provides significant security (and therefore privacy) advantages over other strategies for securing communications, and is generally considered by security experts to be the best choice.
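To make the end-to-end idea concrete, here is a minimal sketch of public-key encryption between two users, written with Python's PyNaCl library. It illustrates only the core principle that the party relaying a message never holds a decryption key; it is not the actual protocol of any app mentioned here (real messengers such as Signal and WhatsApp layer key ratcheting, authentication, and forward secrecy on top of primitives like these).

```python
# Minimal sketch of the end-to-end principle using PyNaCl (pip install pynacl).
# Illustrative only: real messaging protocols add much more machinery.
from nacl.public import PrivateKey, Box

# Each user generates a key pair; private keys never leave their device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# A server relaying `ciphertext` sees only opaque bytes; without one of
# the private keys, it cannot recover the message.

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at noon"
```

Because the private keys stay on the endpoints, whoever transmits the ciphertext sees only random-looking bytes, which is exactly the guarantee that transport encryption alone (where the server decrypts and re-encrypts) does not provide.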

In many ways, it’s wonderful that secure communication is now available to the masses via these kinds of popular tools. However, recent research demonstrates that users who have secure communication tools often don’t know about, or misunderstand, their security properties in ways that inhibit confident and correct use of the technology. In fact, some users prefer significantly less secure tools (such as SMS) for private or confidential messages, erroneously believing they are more secure. Other research demonstrates that adoption of secure messaging tools is driven primarily by popularity within a user’s social network, not by awareness of or desire for security features. These findings raise two key questions: how can we improve users’ ability to choose secure messaging when needed, and how can we increase overall adoption so that the baseline of communication in many social networks is secure and private?

Fifteen years ago, when encrypted communication was much less ubiquitous, researchers at Princeton talked to employees at an NGO who used encrypted email for sensitive communications. These employees believed that encryption was only useful for very secret, highly important messages (secrecy). People who used encrypted email too frequently were seen as suspicious or paranoid (paranoia). Frequent encryption could also be annoying: it might mislead recipients about a message’s urgency, or even draw attention to otherwise unremarkable communications (flagging).

In this work (done in collaboration with Ruba Abu-Salma, Wei Bai, Michelle Mazurek, Elissa Redmiles, and Blase Ur), we set out to investigate whether, 15 years and many generations of technology later, these perceptions of encrypted communication as furtive and often undesirable still persist, potentially inhibiting adoption. We were also interested in how different ways of presenting a messaging app’s security features affect user perceptions and potential adoption. Understanding how users view encryption and secure messaging could provide guidance to spur adoption, increasing network effects and leading to more secure communication for everyone.

To this end, we conducted an experiment in which participants saw one of several app-description pages for Soteria, a messaging app we made up, then answered some questions about it. The descriptions presented Soteria’s security features in different ways: using different terms (“secure,” “encrypted,” “end-to-end encrypted,” and “military-grade encryption”) and describing security as either on by default or opt-in.

We found that the different app descriptions influenced users’ perceptions of Soteria in nuanced but important ways. Compared to “secure,” describing Soteria as “encrypted” or “military-grade encrypted” increased the perception that the tool was appropriate for privacy-sensitive tasks. In contrast, the more precise (and potentially more secure) “end-to-end encrypted” did not have that effect. This finding may help explain why users turn from E2EE tools to less secure alternatives for sending confidential information. We also found that while encryption in general is no longer stigmatized, participants did think users of “military-grade encryption” might be paranoid, even though they were uncertain of the term’s meaning.

Since encryption was (at one time) viewed as furtive or unnecessary, we wondered whether users would view encryption that is on by default less favorably than encryption they could opt into. We found no evidence of this; instead, participants mostly expressed that on-by-default encryption would be more secure and less prone to error.

Fifteen years ago, Gaw et al. predicted that making encryption automatic (as it is today in many messaging tools) might remove some of the social stigma related to secrecy, flagging, and paranoia. We found evidence that they were (mostly) right: our participants found most versions we tested appropriate for general-purpose, non-confidential tasks (with the possible exception of “military-grade encryption”) and tended to view security features as a benefit rather than an annoyance.

Our findings have implications for the design and marketing of secure communications apps. Terms that are too vague (“secure”) don’t inspire trust, but terms that are too specific and poorly understood (“end-to-end encryption”) are not much better. Tool vendors need to strike a balance: extreme security language can increase perceptions of privacy, but it can also hurt a tool’s acceptability for general-purpose use. More work is needed to examine the trade-offs between signaling privacy and promoting broad adoption.

This material is based upon work supported by the United States Air Force and DARPA under Contract No. FA8750-16-C-0022. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the United States Air Force and DARPA.


Omer Akgul

PhD student at UMD CS, figuring out human factors in security and privacy.