Perception of privacy risks of wearables

There has been an increased awareness of health risks among people over the years. In parallel, technology has developed to enable the collection and analysis of personal data through wearable devices, giving rise to ‘self-tracking’ as a phenomenon. In 2015, one in five American adults owned at least one wearable (1), and 78.1 million wearable devices were sold that year, 171.6% more than the year before (2). This rapid growth and the resulting ubiquity of these devices raise critical privacy and security concerns. This report presents an overview of what experts believe the risks are and how the public perceives them, then suggests areas for further investigation and possible approaches to communicating the risks to people.

Experts have established that wearables pose significant technological security risks (3). The major themes are:

1. Identity theft by gathering personally identifiable information

2. Profiling to target or discriminate against people based on personal information or health activity

3. Locating people and stalking based on inferred patterns in location data

4. Embarrassment and extortion based on recorded activities, photos or videos

5. Corporate use and misuse of employees’ and customers’ data(4)

Wearable devices are a particular kind of product belonging to a larger network called ‘the internet of things’, or IoT. More than 70% of cyber security and IT professionals believe that IoT manufacturers aren’t doing enough to maintain data security and that the standards need to be updated, and 84% think that manufacturers don’t make consumers aware of the kinds of information the devices can collect (5). Wearable devices collect personal data constantly, often without the person realizing that it is being generated and shared. Companies that make fitness trackers can collect and reveal users’ data without their explicit consent. In 2011, Fitbit accidentally made details of 200 users’ sexual activity public (6).

A study found that 86% of internet users had taken measures to protect themselves from unwanted surveillance while using the internet (7). Personal privacy on the internet is a concern for the average citizen, who takes considerable precautions for self-protection (8). But how does people’s perception of risks on the internet translate to perceived risks of newer technologies like wearable devices? According to a study of 1,782 internet users, the most common risks associated with owning a wearable device were ‘privacy’ (25.32%) and ‘being unaware’ (15.40%) (9). ‘Privacy’ referred to privacy and security concerns similar to the experts’ findings above. ‘Being unaware’ included ‘being unaware of what the device is collecting, doing or which information it is using’. These concerns broadly matched the experts’, despite the public’s gap in technical knowledge and in the specifics of how the technology works.

‘Loss of privacy’ is a very broad statement. On closer analysis of the study, the risks that people found most upsetting were videos or photos of them unclothed or otherwise embarrassing. Experts consider this a possible risk too. The second kind of risk related to personal financial information such as bank account details, social security numbers and credit card information. This corresponds to the first point raised by security experts: the risk of identity theft. What is interesting is that the least upsetting risks were exercise patterns, moods and emotions, heart rate and gender. These clearly relate to the experts’ concerns about profiling, corporate misuse and drawing inferences from patterns of activity.

In the study, participants’ perceptions of 72 risks were measured in relation to each other. Participants might have responded differently had they been asked to rate the risks individually or in smaller sets. The study doesn’t investigate why people perceive some things as riskier than others. One possible explanation is that people are more concerned about short-term risks and find it difficult to imagine security threats that are more technologically advanced or complicated. People tend to view each of their devices as a stand-alone tool, but in an ‘internet of things’, single data points from many individual devices, or collected over a period of time, can reveal a more complete picture of the customer than the customer herself realizes.
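To make the aggregation point concrete, here is a minimal sketch using entirely hypothetical step-count data (the numbers, field names and threshold are invented for illustration). Each hourly reading is innocuous on its own, yet a few days of readings let a trivial script infer part of the user’s daily routine:

```python
from statistics import mean

# Hypothetical hourly step counts (index = hour of day, 0-23) for three
# days of one user. Any single reading is innocuous; aggregated across
# days, a daily routine (when the user wakes and leaves home) emerges.
days = [
    [0, 0, 0, 0, 0, 0, 0, 420, 1100, 250, 60, 80, 300, 70,
     50, 40, 60, 700, 900, 200, 100, 30, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 380, 1050, 200, 40, 90, 280, 60,
     45, 35, 55, 650, 850, 180, 90, 20, 0, 0],
    [0, 0, 0, 0, 0, 0, 10, 450, 1150, 300, 70, 60, 320, 80,
     55, 30, 65, 720, 950, 210, 110, 25, 0, 0],
]

def first_active_hour(hourly_steps, threshold=100):
    """Return the first hour whose step count exceeds the threshold."""
    for hour, steps in enumerate(hourly_steps):
        if steps > threshold:
            return hour
    return None

# Aggregating across days turns noisy readings into a stable routine.
wake_hours = [first_active_hour(day) for day in days]
typical_wake = mean(wake_hours)  # the user reliably becomes active at hour 7
print(typical_wake)
```

A real adversary could combine such inferred routines with location, heart-rate or Bluetooth-proximity streams from other devices, which is exactly how individually harmless data points compound into a revealing profile.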

Security risks aren’t likely to be a key factor in a purchase decision; features, styling, brand, battery life and durability matter more to people. They may not know or understand what kind of data is collected, how often it’s tracked, how companies manage security, or how personal data is shared with third parties. Furthermore, the actual risk probabilities are unknown, since the underlying systems in our technological landscape are largely opaque. Further studies are needed in this particular area to understand how exactly risks are perceived at a granular level in a dynamic context, and how the benefits are weighed against them.

The authors admit that 83% of their participants didn’t own a wearable. Though this distribution is consistent with national figures, it can be argued that people who do own wearables have a different relationship with them and negotiate the risks differently over time. People who invest in fitness trackers do so to get healthier. They have a vested interest in making the device work for them, which can cause them to overlook privacy risks. Once they’ve established a habit with the tracker, they could start viewing it as a social actor (10), causing them to further discount the ecosystem of technologies that makes their data vulnerable to attack. Nor does the study probe whether owners of wearables take steps to safeguard their privacy while using them.

Though wearables pose considerable security and privacy risks, there is no easy resolution, since companies that create fitness trackers and collect the data sometimes don’t know in advance what data they’re going to collect or what patterns big data will reveal. Their business model may be centered on ‘data innovation’. In these cases, it would be hard to state the privacy risks explicitly. Personal information, locations, habits and activities may not reveal much in isolation, but when analysed in conjunction and alongside other data sets, they could be surprisingly informative. Companies might sell this data to third parties who use it in combination with other datasets to make credit, insurance and employment decisions. This raises a larger policy-level question about the ownership of data: does it belong to the consumer who creates it, the company that collects it, those who aggregate and analyse it, or those who invest resources to store it?

Other than updating software to include better data encryption, tracking prevention and privacy from third-party analytics, companies need to clearly communicate the risks within the privacy policy. This communication shouldn’t just take the form of statistics or probabilities, since those have their limitations (11, 12). When subsequent changes are made that affect the risks, consumers should be asked for explicit consent. Rather than terminating services for consumers who don’t consent, features could be limited to those with lower risks, based on the consumer’s risk tolerance. Hardware features, like a physical Bluetooth switch to reduce continuous exposure, could also be incorporated.

There are significant challenges, and so far the issues have remained in the domain of technologists and cyber-security experts. As designers and policy makers, we must understand how people’s relationship with a personal device informs their perception of the actual risks: not so that we can take advantage of it, but so that we can communicate risks better and design to reduce both actual and perceived risks.


1. Comstock, Jonah. “PwC: 1 in 5 Americans Owns a Wearable, 1 in 10 Wears Them Daily.” MobiHealthNews, 2014. Accessed September 28, 2016.

2. “The Worldwide Wearables Market Leaps 126.9% in the Fourth Quarter and 171.6% in 2015, According to IDC.” Accessed September 28, 2016.

3. Goyal, Rohit, Nicola Dragoni, and Angelo Spognardi. “Mind the Tracker You Wear.” Proceedings of the 31st Annual ACM Symposium on Applied Computing — SAC ’16, 2016. doi:10.1145/2851613.2851685.

4. Symantec Security Response. “How Safe Is Your Quantified Self? Tracking, Monitoring, and Wearable Tech.” Accessed September 28, 2016.

5. ISACA. “ISACA Survey: Wide Gap Between Consumers’ and IT Professionals’ Perceptions on Internet of Things Security.” Press release. Accessed September 28, 2016.

6. Matyszczyk, Chris. “TMI? Some Fitbit Users’ Sex Stats on Google Search.” CNET, 2011. Accessed September 28, 2016.

7. Rainie, L., Kiesler, S., Kang, R., & Madden, M. (2013). Anonymity, Privacy, and Security Online. Pew Internet & American Life Project.

8. “Business Week/Harris Poll: A Growing Threat.” Accessed September 28, 2016.

9. Lee, L., Lee, J. H., Wagner, D., & Egelman, S. “Risk Perceptions for Wearable Devices.” arXiv:1504.05694 [cs.CY].

10. Fogg, B. (2003). Computers as persuasive social actors. Persuasive Technology, 89–120. doi:10.1016/b978-155860643-2/50007-x

11. Gigerenzer, G., Hertwig, R., Broek, E. V., Fasolo, B., & Katsikopoulos, K. V. (2005). “A 30% Chance of Rain Tomorrow”: How Does the Public Understand Probabilistic Weather Forecasts? Risk Analysis, 25(3), 623–629. doi:10.1111/j.1539-6924.2005.00608.x

12. Wells, G. L. (1992). Naked statistical evidence of liability: Is subjective probability enough? Journal of Personality and Social Psychology, 62(5), 739–752. doi:10.1037/0022-3514.62.5.739

Shruti Aditya Chowdhury
