A Better Privacy Policy?

Anmol Parande · Published in MDBlog · 6 min read · Feb 15, 2019

For most Internet users, “Privacy Policy” and “Terms and Conditions” are just boxes to check off when signing up for a new service. Filled with legalese, presented in intimidating fonts, and running thousands of words long, these documents are, unsurprisingly, ignored. If companies really wanted to, they could bury ridiculous requirements inside their terms and conditions and few would bat an eye. In fact, when researchers created a fake social network whose Terms and Conditions required users to give up their first-born child as payment, 98% of people creating an account consented (see the researchers’ paper for details). The issue goes beyond Terms and Conditions: a Pew study found that 52% of Internet users believed a Privacy Policy guarantees the confidentiality of collected user information. In reality, a Privacy Policy simply tells users what data the service collects and how it will be used.

This issue has not gone unrecognized. In the U.S., laws such as HIPAA (the Health Insurance Portability and Accountability Act) and COPPA (the Children’s Online Privacy Protection Act) are meant to protect health data and children’s privacy, respectively. In the E.U., the GDPR (General Data Protection Regulation) was passed in 2016 and went into effect in 2018. Whereas HIPAA and COPPA set down specific rules for how particular kinds of data must be handled, GDPR is a much broader piece of legislation, and it caused turmoil as companies rushed to update their privacy policies and data practices to comply. Although it is a European law, GDPR affects U.S. companies as well, since many of them serve European users.

However, as comprehensive as these regulations are, they have not fixed the core problems with privacy policies. In 2014 (before GDPR), an analysis of the privacy policies of Fortune 500 companies found that each policy would take 7 minutes on average to read at a rate of 250 words per minute, or roughly 1,750 words per policy. If people actually read a privacy policy, they would often spend more time on it than they intended to spend on the site in the first place. Time aside, the same analysis found that 82% of the policies required a college-level reading ability, a level that 75% of U.S. adults do not meet. With GDPR, these documents will only get longer. In addition to the standard disclosures about what data is collected and how it is used, GDPR also requires notifying users of their data rights, whether their data is used for automated decision-making, and where their data is stored, among many other things. Moreover, GDPR still allows users to simply check a box indicating consent to a company’s privacy policy without ever being shown that policy. GDPR is still a fairly new regulation, so it is unclear whether more people will read and understand privacy policies as a result, but making these documents longer is unlikely to help.

For consumers to actually understand a company’s data practices, privacy policies need to be condensed. The conventional approach has been merely to call for transparency and clear, plain language. A better approach would be to push for well-designed icons and short summaries. Imagine if, when creating an account on a website, alongside the usual checkboxes for accepting the Terms and Conditions and the Privacy Policy, the user were presented with a simple list of icons and short sentences describing the service’s data use.

A simple mock-up of what this might look like on a website.

This approach eliminates most, if not all, of the problems privacy policies have today. Icons can give a general idea of the “what”, “how”, and “why” of data collection, and a short sentence next to each icon is enough to clarify its meaning. Cookies, third-party trackers, and the myriad other practices that fall under the umbrella of privacy can each get their own icon. The same concept carries over to mobile apps as well.

A simple mock-up of what this might look like in a mobile app.
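To make the idea concrete, here is a minimal sketch, in TypeScript, of how such a summary could be modeled as structured data rather than prose. The interface, field names, and example entries are illustrative placeholders, not a proposed standard.

```typescript
// One icon plus one plain-language sentence per data practice,
// grouped by the "what", "how", and "why" of data collection.
interface PrivacyDisclosure {
  icon: string;                      // icon shown next to the sentence
  category: "what" | "how" | "why";  // which question the item answers
  summary: string;                   // one short, plain-language sentence
}

// Illustrative entries a service might show at sign-up.
const disclosures: PrivacyDisclosure[] = [
  { icon: "cookie",   category: "what", summary: "We use cookies to keep you signed in." },
  { icon: "location", category: "what", summary: "We collect your approximate location." },
  { icon: "share",    category: "how",  summary: "Some activity is shared with third-party trackers." },
  { icon: "ad",       category: "why",  summary: "Your data is used to target advertising." },
];
```

Nothing here is tied to a particular framework; the same structure could just as easily be stored as JSON and rendered natively in a mobile app.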

This system is not unprecedented, nor is it a particularly novel idea. For example, the Google Play Store uses a similar method to inform users of the permissions an app requires in order to function. Before the user can even download the app, they must accept that they are giving it access to all the information it is requesting.

These permissions are simple and easy to understand; in other words, they actually inform the user. Under a system like this, the user would understand the privacy implications of using the service without spending a great deal of time reading an official document. In order to comply with the existing regulations, companies would also need to include a link to their actual privacy policy, but most users would never have to bother reading it with such a simple summary.
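Building on the sketch above, the snippet below shows one way the icon list and the required link to the full policy could be rendered next to a sign-up form. The element ID, icon paths, and URL are placeholders.

```typescript
// Render the icon-and-sentence list next to the consent checkbox,
// with the legally required full policy kept one click away.
function renderPrivacySummary(items: PrivacyDisclosure[], fullPolicyUrl: string): string {
  const rows = items
    .map((d) => `<li><img src="/icons/${d.icon}.svg" alt="${d.category}" /> ${d.summary}</li>`)
    .join("\n");
  return `<ul class="privacy-summary">${rows}</ul>
<a href="${fullPolicyUrl}">Read the full privacy policy</a>`;
}

// Example usage with the illustrative entries above.
const container = document.querySelector("#signup-privacy");
if (container) {
  container.innerHTML = renderPrivacySummary(disclosures, "/legal/privacy");
}
```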

This system of icons can easily be customized to each business’s needs and would have minimal impact on user experience, since it does not require reading dense paragraphs of text. Given how obvious the approach seems, the main question becomes: why hasn’t it been put into practice yet?

The heart of the issue is that businesses simply do not think it is in their interest to do so. The most cynical explanation is profit: companies whose business models rely heavily on advertising worry that showing users everything that is collected about them will drive people away from the service. Concealing their data practices within dense legal text gives them the freedom to operate as they please.

However, cynicism aside, there is another large factor driving convoluted privacy policies: a disconnect between how comfortable executives think consumers are with sharing data and how comfortable consumers actually are. A Deloitte report found a 29-percentage-point gap between consumers and executives asked whether they agreed with the statement “Consumers believe the risks of sharing personal information is worth the product recommendations they receive.”

Although the report was primarily concerned with the consumer products industry, it points to a general trend in consumer privacy. Many executives see targeted advertising as a benefit to consumers, but far fewer consumers see it that way. The disconnect does not end there. The same study found a 13-percentage-point gap between executives and consumers on whether “most consumer product companies are adequately protecting consumers’ personal information.” In other words, not only do consumers dislike businesses using their data for advertising, but many also do not trust businesses to take care of their personal data.

These disconnects explain the large gaps between executives’ and consumers’ views of which measures would increase trust between consumers and companies. In general, the measures consumers most favored for increasing their trust in a business were the ones executives were least likely to favor. The largest gap, 32 percentage points, concerned making privacy policies more understandable.

Even though Deloitte’s report did not directly study consumer attitudes towards tech products, the preferences reflected in the consumer and executive surveys generalize readily to broader sentiments about privacy and data collection. The type of data collected for consumer advertising is not drastically different from the data collected by tech companies; if anything, the data tech companies collect is arguably more invasive due to its sheer volume. If consumers are wary of consumer product companies and their comparatively limited data, they are likely even more suspicious of tech companies.

Even without Deloitte’s report, it is clear that general consumer sentiment towards privacy is changing. As more people expose large players such as Facebook and Google for their data practices, the spotlight on privacy grows brighter. Some executives have recognized this and are starting to move towards more privacy-friendly practices, but there is still a long way to go.

As for legislation such as GDPR and any regulations the United States may pass in the future, they will be ineffective if they only give the appearance of increasing transparency and privacy. For now, they mostly serve to make privacy policies longer, further obscuring how companies handle data. Rules like those in GDPR might benefit the educated Internet user who is already aware of standard data collection practices and of their personal privacy rights, but they do nothing for ordinary users, many of whom have only a basic familiarity with technology itself.

If informing consumers about their privacy rights is ever going to become as simple as a list of icons and phrases that any Internet user can understand, the push must come from consumers themselves. Businesses have to feel the impact of lost trust on their bottom line. Only then will the disconnect between executives and consumers begin to close, bringing the two sides into agreement on how to protect profits while safeguarding personal privacy.

Anmol Parande is a writer for MDBlog and a Flight Software Engineer at Astranis Space Technologies.