Providing consumers with privacy safeguards in the age of the data-driven economy is not only achievable but also a sound business strategy.
We live in a data-driven economy. Our digital devices are continually keeping track of our actions and preferences, and using this data to determine the best content to serve us. That content might be anything from news stories to videos to advertisements. And with our data, businesses know what media experiences are most likely to make us “tick” the way they want us to.
But what about what the consumer wants? While consumers are embracing expanding opportunities to enjoy media experiences uniquely tailored to their personal preferences, there’s also the question: what aren’t they seeing?
Consumers must be able to move between a digital environment that’s tailored to them based on their habits and tastes, and an “anonymous” environment. This choice between the convenience of a personalized environment and anonymity relies on privacy.
Why do we gather consumer data?
It’s easy to understand why the apps we use and the websites we visit collect data about us. By tracking which articles we read or which videos we view, our devices learn about our preferences and can then predict what content and ads we’ll grant our attention and interest.
For the consumer, this results in a more pleasant experience with better, more accurate content discovery against an almost infinite number of options. In exchange for allowing data gathering, consumers enjoy more personalized content and recommendations. For the business (the app, or the website, or the advertiser), this increases the chances that consumers will remain loyal, will spend more time engaging with them, will make purchases, etc.
The privacy challenge
Unfortunately, many media and ad tech companies aren’t taking privacy seriously. While technology develops at an alarming rate and legislation surrounding what businesses can and cannot do with consumer data struggles to catch up, the boundaries dictating what is right and wrong concerning consumer privacy remain blurred.
Further complicating matters is that users have different desires and expectations surrounding their privacy. There is no “one-size-fits-all” approach to privacy that would satisfy all consumers.
Still, there is a more pressing need than ever for both consumers and businesses to consider the privacy dilemma. From the consumer perspective, users must think about what could go wrong if their data were misused or collected without their consent. Especially considering the growing trend among consumers to stay loyal to brands and businesses that align with their personal beliefs, the time has come for companies to align their actions with ethical behavior surrounding consumer privacy. Organizations must give consumers the option to understand, define, and agree to their privacy settings.
The core of consumer privacy
From the business perspective, the trade-off between privacy and personalization is stark and explicit: the more data they collect, the more effectively they can offer consumers a personalized experience and further monetize the media via targeted advertising. But consumers now demand transparency and control over the ways organizations use their data.
At the core of consumer privacy are three fundamental principles: transparency, control, and security. Let’s go through each of them in turn.
At the heart of consumer privacy are data gathering and sharing. With the growing popularity of IoT and connected devices, data gathering goes beyond computers and smartphones and into appliances like televisions and coffee makers.
In the context of privacy, consumer transparency about the way data is gathered and used is critical to (re-)gaining the trust of consumers. Some legislation addresses this: the GDPR, or General Data Protection Regulation, sets forth clear rules regarding data protection and privacy for individuals in the European Union and the European Economic Area. The California Consumer Privacy Act, or CCPA, signed into law in 2018, established groundbreaking new consumer privacy rights. The law enhances privacy rights and consumer protection by, among other things, restricting the resale of personal consumer data to third parties without first notifying users and giving them the option to opt out.
Most data brokers (entities that collect information about users and then sell that data) and Data Management Platforms (DMPs) offer consumers the option to opt out of having their data collected, analyzed, and traded. However, most consumers aren’t even aware these parties exist, let alone how to opt out of having their data gathered. Once a company sells the data, it’s challenging for consumers to have it deleted, or to stop data brokers and marketers from using it.
Without a portal where consumers can quickly learn about their privacy-related rights and take action regarding what information they choose to share, it’s difficult to say how consumers can take proactive measures to keep their information private. Technology can further support the effort for greater visibility: blockchain’s innovative way of linking operations and securing their immutability in a distributed ledger based on cryptographic algorithms promotes clarity across the ecosystem. The industry must step forward and take responsibility for the ways it uses consumer data. By informing users and allowing them to proactively define what information they’re prepared to give away to which specific services, there remains a two-way value exchange.
Transparency of this kind, which goes beyond the cookie consent banner and is more accessible and usable than the mere “legalese” of a Terms of Service page, allows businesses to gain a tremendous competitive advantage in today’s data-driven economy. Indeed, giants like Google, Facebook, and Amazon have already established this transparency, and are pushing others in the industry to make similar moves.
Nowadays, consumer privacy is under attack at every corner. The ability of technologies to extract user information, regardless of consumer protections and liberties, is growing at an alarming and fearsome rate. Even those individuals who actively try to protect their privacy are no longer safe from intrusion.
Making this reality a bit more complicated is that consumers want, and indeed demand, control over their data. While they sometimes want to remain anonymous, at other times they also want to enjoy the convenience of a personalized relationship with the technologies, media, and services they interact with daily.
New concepts are starting to address this situation. The idea that consumers are personally responsible for their data is beginning to take shape, for example, through smartphone plans that leave it up to the user to define what data is open for consumption. This approach helps address the issue of control over consumer data while leaving space for the data economy to thrive.
Another approach involves employing targeted mobile advertising that considers user privacy. Of course, consumers don’t want the conveniences of the Internet to go away. So systems can be developed where the technologies and Internet services that consumers interact with function according to the consumer’s choices about which publishers they agree to receive targeted advertising from. This approach is especially relevant for the connected, “smart” devices proliferating within our homes. A “privacy hub” that enables users to define privacy settings for all of their devices can help ensure that the ecosystem uses consumer data lawfully and ethically.
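As an illustrative sketch only (the class and field names below are hypothetical, not an existing product), such a privacy hub could be modeled as a central registry of per-device consent settings that ad systems consult before targeting:

```python
from dataclasses import dataclass, field

@dataclass
class DevicePrivacySettings:
    """Hypothetical per-device consent settings a user defines once."""
    device_id: str
    allow_personalization: bool = False          # tailored content on/off
    allowed_ad_publishers: set[str] = field(default_factory=set)

class PrivacyHub:
    """One central place where a user's choices apply across all devices."""

    def __init__(self) -> None:
        self._settings: dict[str, DevicePrivacySettings] = {}

    def register(self, settings: DevicePrivacySettings) -> None:
        self._settings[settings.device_id] = settings

    def may_target(self, device_id: str, publisher: str) -> bool:
        """Check whether a publisher may serve targeted ads to a device.
        Unregistered devices default to no targeting at all."""
        s = self._settings.get(device_id)
        return bool(
            s
            and s.allow_personalization
            and publisher in s.allowed_ad_publishers
        )

# Example: the smart TV accepts targeted ads from one chosen publisher only.
hub = PrivacyHub()
hub.register(DevicePrivacySettings("smart-tv", True, {"news-site.example"}))
tv_ok = hub.may_target("smart-tv", "news-site.example")        # True
tv_other = hub.may_target("smart-tv", "other-ads.example")     # False
unknown = hub.may_target("coffee-maker", "news-site.example")  # False
```

The key design choice is that the default is opt-out: a device or publisher the user never explicitly allowed gets no targeted advertising.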
The key to privacy control is a common understanding of data usage. This understanding covers what data users choose to share (or not to share), and gives consumers the “right to be forgotten.” This important right provides individuals with the power to have their personal data erased. It stretches back to pre-GDPR times, when Mario Costeja Gonzalez sued Google to suppress specific search results about him. Most privacy legislation has built this notion into its text.
We must note that if there is confusion or misunderstanding from either party regarding consumer data and privacy, a statutory body could declare any agreement between the two parties invalid. While consumer data is the “payment” users make for free services like those offered by Google and Twitter, it could be worthwhile for users to have the ability to pay for those same services the old-fashioned way while choosing to keep their data private. Although most might not opt for this, it would create a needed and transparent trade.
Consumers can’t expect security measures to be 100% shatterproof. There will always be data breaches, leaks, and other intrusions. That is why businesses need to take full measures to protect the personal data of their customers. Most companies are now starting from this default position, putting users in a safer place.
Among the best methods for protecting data, whether in storage or in transit, is encryption. A best practice consumers can ask for is data minimization, which requires that businesses don’t collect or use more data than they need. Another best practice is pseudonymization, a de-identification procedure in which personal information within a record is replaced by artificial identifiers, or pseudonyms. These measures help ensure that as little consumer information as possible can be accessed in a data leak or breach.
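As a minimal sketch of what pseudonymization can look like in practice (the field names, key handling, and 16-character token length are illustrative assumptions, not a prescribed standard), direct identifiers in a record can be replaced with keyed hashes before the data is stored or shared:

```python
import hashlib
import hmac

# Illustrative secret, kept separate from the pseudonymized data set;
# without it, pseudonyms cannot easily be linked back to real people.
PSEUDONYM_KEY = b"keep-this-secret-and-rotate-it"

def pseudonymize(value: str) -> str:
    """Replace a direct identifier with a stable artificial identifier."""
    digest = hmac.new(PSEUDONYM_KEY, value.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

def pseudonymize_record(record: dict, identifying_fields: set[str]) -> dict:
    """Return a copy of the record with personal fields replaced by pseudonyms."""
    return {
        key: pseudonymize(value) if key in identifying_fields else value
        for key, value in record.items()
    }

record = {"email": "jane@example.com", "articles_read": 42}
safe = pseudonymize_record(record, {"email"})
# The pseudonym is stable (the same email always maps to the same token),
# so usage analytics still work, but the raw email never leaves the system.
```

Because the mapping is keyed rather than a plain hash, an attacker who obtains only the pseudonymized data set cannot trivially brute-force identifiers back out of it, which is the point of keeping the key stored elsewhere.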
Fortunately, an increasing number of systems are already making these essential capabilities readily available to users. At the same time, it’s more important than ever that businesses — specifically ad tech companies — take control over their data practices to minimize the likelihood of hackers stealing user data.
In light of the current reality surrounding consumer data and privacy, it’s time for the industry to evolve toward a customer-first mindset focused on privacy. With the consumer as the focal point, companies must continuously innovate and experiment to find the best ways to protect their consumers’ privacy.
The value of consumer data for both the consumer and the companies they do business with is clear. In light of that value, organizations must take great care of the data they hold. Beyond legislation, companies should consider the ethical perspective and the intention behind their actions related to user data and privacy.
Sound privacy principles call for transparency, control, and security. Users should be able to enjoy levels of power that enable them to choose how their data is used, and companies must concurrently take measures to keep that data secure. Organizations must design proper data privacy by leveraging technologies such as encryption, pseudonymization, and blockchain that successfully support their consumer privacy goals.
By fostering, promoting, and enabling an ethical and responsible data-use culture with privacy at the forefront, we can support better uses of data and a flourishing data economy. Indeed, this is paramount to the continued growth of the industry and to ensuring that we give consumers what they need and want.