2 Unpopular Predictions for the 2020s: The Decade of Consumer Privacy

Shoshana Maraney
Published in IdentiqProtocol · Dec 26, 2019

“In the ’20s and ’30s it was the role of government. ’50s and ’60s it was civil rights. The next two decades are going to be privacy. I’m talking about the Internet. I’m talking about cell phones. I’m talking about health records and who’s gay and who’s not. And moreover, in a country born on the will to be free, what could be more fundamental than this?”
- Sam Seaborn, The West Wing

Aaron Sorkin might have been a decade or so off when he wrote this, but in principle he was spot on. Like the best literature and film, his writing was prescient; he could have been describing right now. As we head into the 2020s, so much of our discourse revolves around privacy, both what it is and what we want it to be.

It’s the role of art to hold a mirror up to life. When it comes to privacy, it seems that it’s a pretty black mirror.

Of course, technology’s impact on privacy and what we should do about it has been a part of the popular consciousness for a long time. George Orwell first published the famous 1984, with its apparently prophetic depiction of a society under constant surveillance, in 1949. But it was in 2017 that UK sales of the book rocketed up by 165%. And 2019, the book’s 70th anniversary, saw a flood of commentary comparing the book’s fictional setting and invasion of privacy to our modern world.

There’s a lot of debate, and in particular a lot of dissatisfaction with the state of data privacy today. But you can sense a change happening. The very fact of privacy’s sudden prevalence as a subject is cause for hope. When it comes to privacy two things seem clear: We have a long way to go. And all the signs are that the coming decade is when we’ll get there.

We think we can see some of what’s ahead — though we’re aware our predictions aren’t going to be that popular.

Privacy has become a recurring theme in popular culture, which is significant because it both draws on and increases a shared sense that this is one of the vital issues of our times.

Wondering what the future of privacy will look like? Here's our two cents, in the form of two (unpopular) predictions.

Unpopular Prediction Number 1: Business, Not Law, Will Ultimately Drive Privacy Changes

Privacy is a theme that has been gaining momentum over the last few years: in the media, on social networks, on the legislative floor. And the standard assumption is that businesses will not only lag when it comes to making needed changes, but will probably resist attempts to improve things. Many commentators have predicted that only regulatory changes will be able to move the needle.

We don’t think that’s true. It’s understandable, because companies have profited from the very free and almost unregulated approach to data. But it’s missing the bigger picture.

Not that legislation isn't important. It is. Legislation provides the necessary motivation for change; it kickstarts the movement. But it isn't enough. Business needs, once the incentive is there, will prove more powerful.

There are two points here. Firstly, there’s the shift in perspective many large tech companies are now undergoing — from a strong emphasis on growth, to a more mature and long-term approach, which necessarily includes consideration of data privacy.

Let’s think about the TV show that typified the tech world on our screens. Data privacy was the central theme of the finale of Silicon Valley, a show whose popularity has come from its on-point reflection and satirization of the startup and Big Tech world.

What was really interesting about the end of Silicon Valley, though, was the lack of debate between the main characters about whether privacy was important. In the real Silicon Valley, privacy is a hot topic — there are ongoing, passionate, in-depth discussions about privacy versus profit, privacy and its potential to stop innovation, the different ways privacy could impact the brands and platforms we use every day. There’s a lot of discussion about what the role of technology companies ought to be in relation to privacy.

But not in the show. In the finale of Silicon Valley, they weren’t wondering whether we have a right to privacy, or whether technology companies have a duty to protect privacy. They didn’t discuss it at all. They assumed it and worked from there. As soon as the issue was on the table, the “right answer” was immediately apparent to all the characters in the room. The question was what to do about that.

The absolute absence of debate was striking, particularly in a show known for exploring the complex issues at the heart of today’s tech world. Much of the finale was set ten years from now, and it felt almost as though this attitude towards privacy was borrowed from the future. The characters taking the decision saw themselves, reasonably by this point in the show, as potential tech leaders, people who could have a significant impact on the world. They had moved from growth to relative maturity. And they thought business should act, independently of any legal requirements, to protect privacy.

The Great Hack seemed to present the same conclusion, if from a different direction. Esquire called it “the horror film of the year” and that’s certainly not for its dramatic special effects (spoiler: there aren’t any). What’s frightening is the harsh light the documentary shines on what happens when no one is really paying attention to data privacy.

This brings us to the second point. For years, none of us were paying attention, businesses and consumers alike. The potential of online options to save us time, money and effort was exciting and enticing; the opportunity to grow a company into an international success, or to connect parts of a business in ways hitherto undreamed of, was dazzling. For both businesses and consumers, it took time for data privacy to emerge as the hidden cost.

That phase is officially over. Data privacy has emerged as one of the great and growing concerns of the end of this decade. The issue is now on the table, for both companies and consumers — and consumer feeling will make sure it’s something companies are taking seriously.

Heading into the next decade, we predict that businesses will become the primary drivers of the needed changes in our shared approach to data privacy. Not a popular opinion, but hear us out.

  1. It makes business sense. Consumer opinion (as reflected in the popular culture sources referenced here) has swung firmly round to the view that privacy matters. In fact, in a recent IBM study, 83% of consumers said they would stop working with a firm if they discovered their information had been shared without their consent. Why risk consumer loyalty when changing gives you a competitive advantage, especially when the change is probably on its way anyway (see point 2)?
  2. Future-proofing against legislative change protects a company. With GDPR already 18 months live, CCPA due to hit at the beginning of 2020, and a US federal privacy law already on the table, it's obvious that regulations are going to mandate change in this area in the not-too-distant future. Starting to assess and improve data privacy now means companies are less at risk, whatever the future holds when it comes to legislation.
  3. Corporate vision has expanded. The influential Business Roundtable changed its policy on the stakeholders a company is duty-bound to consider: where it used to be just shareholders, now employees, suppliers, local communities and even the environment are relevant. That's not just a suggestion; a change like this happens because it reflects something the group concerned is already thinking. The very notion of corporate responsibility is changing. It's our bet that data privacy will be high on the agenda very soon, if it isn't already.
  4. Don’t be evil. A lot of the companies that are most impactful in the privacy dilemma and debate are technology companies. Those companies have gotten a bad rap over privacy, but the fact is that many tech companies feel strongly about ethical issues, once they’re aware of them — and they want to do the right thing.

The 2019 Edelman Trust Barometer found that 55% of those surveyed believe CEOs can create positive change when it comes to personal data and 76% believe CEOs should take the lead on change rather than waiting for government to impose it. We reckon that’s just what’s going to happen.

Unpopular Prediction Number 2: Technology As The Solution, Not (Just) The Problem

Yes, technology is one of the sources of the privacy problem we're facing today. Yes, it wouldn't have happened without it. Yes, the dystopian fictions that have become so popular in recent years usually include some kind of technology-gone-too-far or, on the other hand, technology implosion that causes the predicament. And yes, that feels very plausible. It's true that the problem is caused by technology: by our sudden new ability to gather, store and, most crucially, process data in amounts that would have been simply inconceivable a few decades ago. But that's not the whole story.

The popular dystopian trilogy The Hunger Games has Katniss Everdeen opposing the might of the Capitol, whose technology seems to allow its leaders to see everything within its bounds. But here the punishing surveillance of the state is sometimes turned against itself, with the original rebels subverting the jabberjays, the Capitol's recording birds, to relay false information to the enemy, and with Katniss and other victors using their publicity to promote the rebels' message.

Mr Robot has an even more complex relationship with technology, which is shown as both dangerous and full of promise. It’s also a form of expression and interaction with the world, and the relationship different characters have with different elements of technology is explored as an integral part of their personalities and abilities.

While tech is often the source of various dangers, it is always clear that technology — allied with human ingenuity and morality — must also be the solution. That’s the situation we’re in today.

Up until this point, tech has been targeted as the source of our privacy problems. We let technological development run on too quickly and too freely, and now legislation is needed to rein it in and protect us. There’s some truth to that; legislation is certainly a part of the answer. But it’s not enough, and might not even be the decisive factor in reaching the right balance between convenience and privacy.

Historically, technology has been a key part of the solution to the problems it causes. Regulation, as we mentioned earlier, is often the motivation, but it's not the solution. Public concern about pollution resulted in clean air acts all over the world, which led to technological innovations for both monitoring pollution and creating alternatives to the problematic practices and tools. Concern over airline emissions kicked off the same process. Now the problems caused by our consumption of meat are (hopefully) going to be alleviated by lab-grown alternatives.

We think privacy will follow the same pattern. Legislation will motivate change, but technology will be the solution. (In this context it is perhaps disturbing to contemplate the future of data in industries exempt under laws like GDPR: the kinds of data files that have alarmed journalists would be unaffected, for example, where they're held by companies in the exempt fraud prevention industry. Identiq's privacy-first identity verification technology, however, would provide the solution here.)

Technology is itself a vital part of the solution, and early this decade companies will start exploring tech-driven ways to give consumers what they want in terms of both product and privacy.

Identiq is itself a perfect example of the beginning of this trend. It’s a technological solution to both the challenge of identity verification and the privacy problems inherent in the traditional validation model. Because Identiq has created a peer-to-peer network where absolutely no sensitive user data is ever shared, copied or in any way exposed, privacy is protected at the same time as identities are validated. This is in contrast to today’s practice, where personal data is routinely shared with third parties.

It’s a practical way to do what companies and consumers need — companies, so that they can prevent fraud, and consumers, so that they can transact and interact online easily and smoothly — which has respect for data privacy built in from the ground up.

Changing people’s practice and way of thinking is hard. Giving people the technology to do what they’re used to in a way that protects privacy is far more effective.

The 2020s are going to be the decade of consumer privacy. That debate, and the solutions we create as part of it, will touch every aspect of our lives. Technology will drive the solutions, not just the problem. Within ten years things will be completely different.

In ten years, we'll be astonished by how primitive things were today. The business approach and the technology will be almost unrecognizable, though, we hope and suspect, surprisingly intuitive.

Bring on the 2020s!


Shoshana Maraney is Content and Communications Director at Identiq, a writer, and an editor. Lover of words, wisdom and Weird Stuff. Fascinated by online fraud and identity for 5 years now.