It’s Time for Brands to Develop a Code of Ethics for Data Usage

The urgency of aiming for a higher standard & four key principles to consider

IPG Media Lab
8 min read · Aug 22, 2019


Photo by Joshua Sortino on Unsplash

In the face of increasing consumer awareness of data privacy concerns, it is more important than ever for brand advertisers to maintain consumer trust by handling consumer data with respect. All over the world, laws and regulations are being put into place to protect consumer privacy and penalize bad actors. But companies shouldn't wait for the law to provide a guideline for how to conduct themselves when it comes to data collection. Laws and regulations take time to pass through the bureaucratic pipeline and go into effect, so they often lag behind consumer expectations and perceptions. Rather than mere legality, brand marketers should focus on drawing a moral and ethical line.

Why Every Brand Needs One

Companies need a new code of conduct for data collection and usage because consumer expectations around data privacy and security are rising quickly. Consumers have grown savvier about data privacy after years of data breaches and privacy scandals plaguing headlines and nudging into mainstream consciousness. Naturally, with heightened awareness comes an increased sensitivity towards potential issues. Perception moves quicker than logic and press releases, and if your data practices rub consumers the wrong way, it will be hard work to get back into their good graces.

Increasingly, opting out of data collection and targeted ads, even at the cost of less personalization, is becoming a premium feature that consumers value. The privacy-heavy messaging that Apple's marketing has been harping on is a good example of positioning data privacy as a premium feature. But data privacy shouldn't be a luxury; it should be something that all consumers are entitled to. Plus, given that completely opting out of collecting data is not a viable solution for any brand (if you're not doing it, chances are your competitors are, and are putting that consumer data to use), brand advertisers will have to take the initiative, start thinking about ethical data use and collection, and make efforts to fix the unbalanced data-value exchange before the legal system intervenes.

Given the complicated times we live in, consumer perceptions are sometimes tied to bigger socio-political contexts. Earlier this summer, a set of aging filters from an app called FaceApp gained viral popularity on social media. Before long, media outlets started raising concerns about the app's data policy and warning people of the app's Russian ownership. It's worth noting, though, that FaceApp's terms of service are not that different from Facebook's, and despite the Russian ownership, the servers for the app are all US-based. Although the bad press over FaceApp might be overblown, the perceived issue of data sovereignty was enough to give people good reason for doubt, reflecting consumers' growing distrust of data collection and the internet following the high-profile Cambridge Analytica scandal.

The ugly truth is that the methods behind what Cambridge Analytica did have been common practice in the commercial realm. It was only when significant political consequences were revealed that people started to pay attention. But consumer expectation is never stagnant; it tends to travel across domains, and soon the backlash against Cambridge Analytica's unethical handling of voter data snowballed into a larger movement against Facebook's (and other tech giants') mismanagement of consumer data, whether substantiated or perceived. Souring consumer perception, in turn, gave lawmakers and regulators the ammunition they needed to start floating antitrust motions against big tech this year with growing momentum. With time, it is entirely possible that this consumer backlash will spill over from big tech to advertising as well. Therefore, it is paramount that brands proactively establish guidelines for ethical data usage and diligently adhere to them.

Photo by Nathan Dumlao on Unsplash

Key Principles to Consider

No two companies have the same code of conduct, for that document needs to reflect the company culture and brand mission to be effective. Similarly, companies should be free to come up with their own ethical guidelines regarding data privacy based on their respective business objectives and regional cultures. That being said, there are some key universal principles that all brands should consider when drafting their code of ethical data usage:

Transparency

As in any good relationship, honesty is the best policy. A study by SAP found that 79% of consumers will ditch a brand if they learn their personal data is being used without their knowledge. Transparency regarding the purpose and methods behind data collection is key to building consumer trust and clearly communicating the value proposition. The goal should not be self-serving but rather mutually beneficial, with a focus on enabling consumers to grant informed consent before trusted brands access their personal data.

Brands can take a cue from how Google, the king of search and email data, frames its data collection practices and communicates the benefits they offer. At its annual I/O developer event in May, the Alphabet company devoted significant stage time during its opening keynote to clearly stating why each of its new and updated services collects user data and the benefits users gain in return for granting access. The result is neither defensive nor pandering; it comes off as an honest, well-articulated bid to win consumer trust.

Moving forward, consumers should and likely will demand full visibility into how extensively their personal data is being monetized. Facebook's big promise after the Cambridge Analytica scandal broke was to update its data privacy tools to make the whole process more transparent. New features, including a redesigned settings menu on mobile devices, a privacy shortcuts menu, and a tool called "Access Your Information," were quickly introduced, but educational efforts to raise awareness of those tools among Facebook users are still severely lacking.

Accountability

As the backlash against tech platforms builds, big tech is increasingly being held responsible for questionable data practices and for its incompetence in regulating hate speech and misinformation. To their credit, Facebook and its peers have taken some commendable steps to address those pressing issues. Last week, Facebook suspended HYP3R, a marketing firm, for allegedly scraping public Instagram profiles and saving content as well as location data in breach of Instagram's terms. In addition, Facebook is making the software it uses to detect harmful content open source to crowdsource ideas for improvement.

Similarly, brands need to take proactive measures to ensure the data they collect is accounted for. Security and data anonymization are now table stakes for accountability, and effective protocols need to be put in place for the unfortunate event of a data breach. This means accountability not just in terms of legal compliance, but also in ethical grey areas. In other words, brands should cultivate a company culture that emphasizes the responsibility that comes with protecting user data.

To achieve that, brands need to organize themselves so that every department that touches consumer data is held accountable for data security and ethical usage, with designated individuals leading the charge in implementing policies and practices for accountability. This also means putting appropriate technical and organizational measures in place so that brands can clearly demonstrate what they did with the data they collected and prove the effectiveness of those measures when requested. At its core, accountability should be a built-in feature, not a fail-safe system.
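One technical measure that supports this kind of accountability is an append-only audit trail: every action taken on a consumer dataset is logged so the brand can later demonstrate, on request, exactly what was done with the data. The following is a minimal, hypothetical sketch (the class, field names, and sample teams are invented for illustration, not a prescribed implementation):

```python
import json
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditLog:
    """Append-only record of every action taken on consumer data.

    Hypothetical sketch: entries are only ever appended, never edited
    or deleted, so the log can be replayed to show what happened.
    """
    _entries: list = field(default_factory=list)

    def record(self, actor: str, action: str, dataset: str) -> None:
        # Timestamp every action in UTC so the trail is unambiguous.
        self._entries.append({
            "at": datetime.now(timezone.utc).isoformat(),
            "actor": actor,
            "action": action,
            "dataset": dataset,
        })

    def report(self, dataset: str) -> str:
        """Produce the evidence trail for one dataset on request."""
        matching = [e for e in self._entries if e["dataset"] == dataset]
        return json.dumps(matching, indent=2)

# Example usage with invented team and dataset names:
log = AuditLog()
log.record("crm-team", "export", "newsletter_signups")
log.record("analytics", "aggregate", "newsletter_signups")
```

The design choice here is that the log answers accountability questions ("who touched this dataset, and why?") without anyone having to reconstruct history after the fact, which is what "built-in feature, not a fail-safe system" implies in practice.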

Moderation

In order to make accountability an achievable goal, a healthy degree of moderation is required to help brands make sure their data collection practices are manageable and within reason. Adopting a value-driven approach would be a smart strategy for deciding which datasets to collect and which to disregard. After all, there is little use in collecting all the user data you can; doing so only results in redundant, useless data that obscures the real consumer insights brands could have gained from a leaner set of data points.

In practice, this means brands should only collect consumer data that they can prove to be useful in improving customer experience. Moreover, this should be implemented at an individual level, which means routinely purging anonymized data that isn't relevant to improving a specific user's experience once trends and insights have been extracted.

Applying moderation to data collection also means that brands should consider eliminating internal processes of acquiring third-party data and use only data that they have earned through explicit customer consent via owned consumer touchpoints. While this may seem like a radical idea given the common practice of data-sharing and how the current ad tech industry works, it is still an idea worth considering, if only for the sake of setting the ethical parameters for data practices.

Another area that requires the exercise of moderation is data retention. Holding on to all the data you've collected forever is not only unnecessary and counterproductive, but also adds to security risks in the long run. A routine data-combing practice ensures you only keep what you need, with the granularity you need, for as long as you need it. Specific data-management tactics, such as granular data labeling, can, when implemented effectively, help brands make better use of their datasets and identify the extraneous data points that should be excluded.
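A routine retention pass of this kind can be sketched as follows. This is a hypothetical illustration: the data categories and retention windows below are invented, and real policies would be set by legal and privacy teams.

```python
from datetime import datetime, timedelta, timezone

# Invented retention windows per data category, for illustration only.
# A window of None means the (anonymized, aggregated) data may be kept.
RETENTION = {
    "behavioral": timedelta(days=90),        # fine-grained clickstream
    "transactional": timedelta(days=365 * 2),
    "aggregated": None,
}

def comb(records, now=None):
    """Keep only records still inside their category's retention window."""
    now = now or datetime.now(timezone.utc)
    kept = []
    for rec in records:
        window = RETENTION.get(rec["category"])
        # Records in unlimited-retention categories are always kept;
        # everything else is dropped once it outlives its window.
        if window is None or now - rec["collected_at"] <= window:
            kept.append(rec)
    return kept
```

Run periodically (say, as a nightly job), a pass like this enforces "keep what you need, for as long as you need it" mechanically rather than relying on ad-hoc cleanups.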

At the end of the day, moderation can serve as an antidote to the kind of capitalistic greed that often compromises ethical practices. Following the principle of moderation is key to developing an ethical data practice that puts consumers' benefit front and center and keeps things elegantly manageable.

Data Sovereignty

Data sovereignty refers to the idea that data is subject to the laws and governance structures of the nation in which it is collected. With the development of cloud computing, however, geographic boundaries are no longer the only criterion for data sovereignty. In response, many countries have passed various laws governing the control and storage of data, requiring brands to incorporate this concept into their ethical data guidelines.

Customers from different cultures have different expectations of data privacy, so international companies need to exercise discretion when it comes to cross-border or regional data collection. Chinese companies and consumers, for example, may have a different set of expectations about what is considered ethical in data collection, and it is up to each brand to decide how to accommodate those regional differences without compromising its overall principles. Generally speaking, it is safer to err on the side of restraint when it comes to transnational data collection.

As the aforementioned FaceApp example demonstrates, data sovereignty also significantly impacts how consumers perceive brand trust and data security. Last year, Apple received much flak from privacy watchdog groups when it was forced to transfer the servers storing Chinese iCloud user data from abroad to a state-owned operator in China; ultimately, Apple had no other choice short of losing access to the entire Chinese market. While this points to the limits of ethical decision-making when it clashes with corporate interests, it also serves as an interesting case of how data sovereignty may affect consumer perception of a brand's ethical standing in privacy practices.

A Higher Standard

At a time when consumer trust is at its most vulnerable, and when laws and regulations are falling behind on consumer protection in the face of technological advances, it is important that brands and advertisers aim for a higher standard: an ethical code of conduct that goes beyond legal guidelines. Doing only what is legal is a bare minimum that leaves valuable consumer trust at risk. Only by adhering to a higher standard can the ad ecosystem continue to bring value to consumers and earn their trust.

Additional Resources:

  • ID2020 project — a nonprofit public-private partnership committed to improving lives through digital identity
  • An Interactive map that compares data protection laws around the world by DLA Piper
  • Acxiom Whitepaper — The New Codes of Conduct: Guiding Principles for the Ethical Use of Data.


IPG Media Lab

Keeping brands ahead of the digital curve. An @IPGMediabrands company.