Personalization for Privacy Recommendations

Hue Watson
GT Usable Privacy and Security Course
3 min read · Feb 15, 2019

Description: I attended the GVU Brown Bag Seminar today (2.14.19), where Bart Knijnenberg, an Assistant Professor at Clemson University, introduced some of his research on making privacy more usable and accessible to the general population. He talked about how current privacy practices don't work and how personalization is necessary. One method he mentioned for making security and privacy more fun and accessible was to use cartoons to illustrate security dangers/warnings or privacy details. This is an easy way to make security and privacy concepts not only more understandable, but also more memorable and engaging (he also mentioned the bonus of making them more accessible to people with low literacy or dyslexia). Among the research projects discussed, one in particular stood out: a system that gave privacy recommendations based on specific user preferences. The purpose of the system is to figure out what users want. By measuring user characteristics and behaviors, his team found four categories of data that people consider sharing: Facebook activity, location, contact info, and life/interests. Five user profiles were parsed from the data, ranging from people who wanted to share everything to people who only wanted to share certain types of data. The system would then give privacy recommendations based on a machine learning algorithm. The most advanced ML algorithm matched user preferences 82% of the time, an improvement over the default settings of allowing everything to be shared (correct 28% of the time) or allowing nothing to be shared (correct 70% of the time).
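
To make the idea concrete, here is a minimal sketch of how such a profile-based recommender might be built. This is not Bart's actual system: the profile probabilities, the simulated data, and the model choices (k-means clustering plus a random forest via scikit-learn) are all my own assumptions, chosen only to illustrate the pipeline of clustering users into sharing profiles and then predicting an unset permission against the two static defaults.

```python
# Illustrative sketch of a profile-based privacy recommender.
# All numbers and modeling choices are invented for demonstration;
# they are not the study's data or method.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical latent profiles over the four data types from the talk
# (Facebook activity, location, contact info, life/interests). Each row
# is the probability that a user of that profile shares each type.
PROFILES = np.array([
    [0.9, 0.9, 0.9, 0.9],  # shares nearly everything
    [0.8, 0.2, 0.2, 0.8],  # shares activity/interests, guards the rest
    [0.2, 0.8, 0.5, 0.2],
    [0.5, 0.5, 0.5, 0.5],
    [0.1, 0.1, 0.1, 0.1],  # shares nearly nothing
])

n_users = 1000
profile_ids = rng.integers(0, len(PROFILES), size=n_users)
# Observed sharing decisions: 1 = willing to share, 0 = not.
decisions = (rng.random((n_users, 4)) < PROFILES[profile_ids]).astype(int)

# Predict one decision (contact info) from the other three, standing in
# for the recommender filling in a permission the user has not set yet.
known = decisions[:, [0, 1, 3]]
target = decisions[:, 2]

# Cluster the known decisions to recover profiles (the talk found five),
# then feed the cluster id to the classifier as an extra feature.
profiles_hat = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(known)
X = np.column_stack([known, profiles_hat])

X_tr, X_te, y_tr, y_te = train_test_split(X, target, test_size=0.3, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

# Compare the learned recommender against the two static defaults.
print(f"ML recommendation accuracy: {model.score(X_te, y_te):.0%}")
print(f"'allow everything' default: {(y_te == 1).mean():.0%}")
print(f"'allow nothing' default:    {(y_te == 0).mean():.0%}")
```

The point of the comparison at the end mirrors the talk's finding: a static default can only be right for the majority preference, while a per-user model can beat both defaults by adapting to each profile.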

Relation: I thought this was an interesting application of our discussion today about how to command attention for security warnings. As we discussed in class, it would be worthwhile to tailor security and privacy to meet users' needs: make them more personalized in a way that entices users or catches their attention. Also mentioned in class was the notion of making security and privacy more fun and engaging. With relatable cartoons or characters, or even gamification, relaying security and privacy information could become a more interactive and pleasurable experience. Currently, developers have a specific notion of what privacy and security systems should be doing, but these systems are not necessarily usable. If users can't use these features, or simply ignore them, what is the point?

Implication: Bart offered three main takeaways about designing privacy systems with a user-tailored approach that I would like to repeat: this method "1) relieves some of the burden of controlling privacy, while at the same time respecting each individual's preferences, 2) provides realistic empowerment: the right amount of transparency and the right amount of control, 3) refrains from making moral judgements about what the 'right' level of privacy should be." Privacy is highly personal, so it is important to understand the nuances of individual needs. People also carry a certain stigma around talking about privacy and security: not only are these topics personal, but people don't want to be judged on whether they are doing what they should be and whether they could be doing more. Allowing personalization is a good step toward lowering the high barrier to entry that security and privacy usability presents.
