How will we design digital services and products in the era of surveillance economics?

Jana Velozo
Published in Design Globant
Nov 11, 2021 · 7 min read
[Image by Cerillion Skyline]

In an increasingly dystopian world, with corporate giants exerting ever more control through technology, we have witnessed the strengthening of the post-truth era, the manipulation of behavior, and the emergence of the surveillance economy. With a 220% increase in the use of digital applications and devices, people spent more than 1.6 trillion hours on cell phones in the first half of this year. According to the Digital 2021 report released in October, 62% of the world’s population is already online. We currently have 5.29 billion people using cell phones, 4.88 billion using the internet, and 4.55 billion actively using social media. The COVID-19 pandemic has further accelerated this process, and now, more than ever, we depend on digital systems for work, education, communication, entertainment, transportation, news, shopping, health, finances, food, and even sleep. We need to start asking deeper questions about the impact of the products and services we are helping to build, and about what our role will be in the next decade.

In the past, the industrial revolution helped us overcome the limitations of our physical strength. The technological revolution we are living through today is helping us overcome the limitations of our minds, extending and expanding human cognitive function. There is no doubt about the benefits technology has brought: tools for human communication and connection, drones delivering products to inaccessible areas, vehicle sensors reducing accidents, high-precision surgical robots in healthcare, smart homes, among many other things. The goal is not to vilify technology, but to recognize that there are better ways to move forward with it. After all, “Artificial Intelligence”, Big Data, and Deep Learning (which loosely simulates human neural networks in machines) are here to stay. I believe that we designers have an affirmative social and moral responsibility for the products we create. What ethical considerations should we be making?

Two fundamental rights need to be considered: the right to privacy, which in this context means that citizens have personal control over the use of their personal information and remain the owners of their data, including data collected about them under surveillance; and the right to self-determination, which refers to autonomy, decision-making power, and control over one’s own destiny (social, political, and economic). Are we designing products that defend privacy and data security and protect our fundamental human rights? Are we allowing people to be informed about, and in control of, when and how their data will be collected, stored, copied, analyzed, used, sold, and destroyed by companies? What are the social consequences, and what more will we lose if we do not become an active voice in protecting and regulating the data behind digital products?

“At no other time in history have the richest private companies had at their disposal a widespread global architecture of ubiquitous computing capable of accumulating incomparable concentrations of information about individuals, groups, and populations, sufficient to mobilize the axis of monitoring to command human behavior remotely and on a large scale.”

Shoshana Zuboff

According to Shoshana Zuboff, a social psychologist and professor at Harvard Business School, surveillance capitalism provides free services to the population through digital systems that monitor people’s activities and behavior, understand their personalities and vulnerabilities, and feed manipulation engines called “machine intelligence” to create and shape behavioral markets, not only in the digital sphere but also in the physical world. The goal is to automate our behavior and profit from our choices. The more data the algorithms obtain about our lives through digital devices, social media, and IoT artifacts, the better they work and the more imperceptible the mechanisms of influence become. One of the big problems is that this data is collected and controlled surreptitiously: systems listen through built-in microphones, watch and record through built-in cameras, and experiment on us without our knowledge and, most importantly, without our consent.

The General Data Protection Regulation (GDPR) defines consent as “any freely given, specific, informed and unequivocal expression of an individual’s choices regarding the processing of his or her personal data for one or more specific purposes”. Consent must be given as an expression of actual choice (see the sketch after the list):

  • Through an easily accessible form, with clear and understandable language
  • With explicit affirmative action and the purpose of processing fully explained to the individual
  • Free of influence or repercussions that could affect the individual’s choice, and not set as a precondition for the service to be rendered
  • With the possibility for the individual to withdraw consent as easily as it was given, without negative consequences
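None of this needs to stay abstract. As a minimal sketch, assuming a hypothetical product and illustrative field names (nothing here comes from a specific law or library), the requirements above can be reflected in the data a system keeps about each consent decision:

```typescript
// A hypothetical consent record reflecting the requirements above.
// Field names are illustrative, not a legal or library standard.
interface ConsentRecord {
  userId: string;
  purpose: string;                      // one specific purpose, fully explained to the person
  plainLanguageNotice: string;          // the clear, understandable text that was shown
  givenByAffirmativeAction: boolean;    // explicit opt-in, never a pre-ticked box
  notAPreconditionOfService: boolean;   // the service still works if consent is refused
  givenAt: Date;
  withdrawnAt?: Date;                   // withdrawal must be as easy as giving consent
}

// Withdrawing consent is just another recorded event, with no penalty attached.
function withdrawConsent(record: ConsentRecord): ConsentRecord {
  return { ...record, withdrawnAt: new Date() };
}
```

Modeling withdrawal as an ordinary, penalty-free event is what keeps revoking consent as easy as giving it.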

Data Privacy concerns the “proper” handling, processing, storage, and use of personal information, while Data Security concerns the protection of personal data against unauthorized access by third parties, malicious attacks, and exploitation. Both are necessary for the regulation and protection of digital data, yet both are still absent from many of the digital products and services we design and use. According to the Information and Privacy Commissioner of Ontario (IPC), there are fundamental Privacy by Design principles that should be applied in the creation of digital systems (a small sketch of the “default configuration” principle follows the list). Privacy should:

  • Be proactive and preventive, not reactive, and anticipate privacy issues before they reach people
  • Be the default configuration built into the system
  • Aim for positive-sum outcomes with full functionality, avoiding false dichotomies such as privacy versus security
  • Have its standards visible, transparent, open, documented, and independently verifiable
  • Be people-centered, open, visible, and transparent, with full lifecycle protection and end-to-end security
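Read literally, “privacy as the default configuration” means that every data-sharing option starts in its most protective state and nothing is shared until the person actively opts in. A minimal sketch, with hypothetical setting names:

```typescript
// A hypothetical sketch of privacy by default: the most protective values
// are the initial values, and the person opts in rather than opting out.
interface PrivacySettings {
  shareUsageAnalytics: boolean;
  personalizedAds: boolean;
  locationTracking: boolean;
  thirdPartyDataSharing: boolean;
}

const DEFAULT_PRIVACY_SETTINGS: PrivacySettings = {
  shareUsageAnalytics: false,
  personalizedAds: false,
  locationTracking: false,
  thirdPartyDataSharing: false,
};
```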

Documentaries such as The Social Dilemma and The Great Hack have shown us how the use of technology and the large-scale manipulation of behavior, through tools such as quantum computing, “Artificial Intelligence”, Big Data, and Deep Learning, are changing our perception of reality, interfering with our behavior, and impacting political and economic systems. These tools are also restructuring the labor force, increasing income inequality, contributing to the imprisonment of people for their ethnic origin, opinions, race, sexual orientation, or dissenting religious beliefs, threatening democratic political systems, feeding autocracies and totalitarian regimes, amplifying social problems, and creating an extremely polarized world. Technology is being used as a tool of manipulation, control, and power in the service of private interests. And in a capitalist world, the logic of capital accumulation defines the rules of the game.

“Small differences in adjustment, carefully applied consistently, have a cumulative effect over time. The manipulation of behavior can change the outcome of elections, modify the perception of truth, and ultimately sabotage human society”.

Jaron Lanier

One of the tools available to us as designers of digital products is the Privacy Impact Assessment (PIA), applied at the beginning of any project involving personal data. Personal data is any information that relates, or may relate, to an individual who can be identified directly or indirectly through it, alone or in combination with other information: name, address, IP address, and so on. The objective of a PIA is to analyze the project’s impact on the privacy of the individuals involved and to minimize it as far as possible, while respecting the principles mentioned above. The main steps are listed below (a rough sketch of how the first two might be recorded follows the list):

  • Trace and analyze information flows
  • Identify any related privacy risks
  • Consult the people who will work on, or be affected by, the project
  • Identify and evaluate solutions to overcome or remove these risks
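As a rough sketch of the first two steps, assuming a hypothetical project: each flow of personal data is traced and annotated with the risks it introduces, so they can be discussed with the people affected and then removed or mitigated. The structure and example values below are purely illustrative:

```typescript
// A hypothetical sketch for tracing information flows and the privacy risks
// attached to them during a PIA. All names and values are illustrative.
interface PrivacyRisk {
  description: string;                 // what could go wrong for the person
  severity: "low" | "medium" | "high";
  mitigation?: string;                 // the solution chosen to remove or reduce the risk
}

interface DataFlow {
  dataItem: string;                    // e.g. "email address", "IP address"
  source: string;                      // where the data enters the system
  storedIn: string;                    // where it lives
  retentionDays: number;               // and for how long
  sharedWith: string[];                // third parties, internal teams
  risks: PrivacyRisk[];
}

const exampleFlow: DataFlow = {
  dataItem: "IP address",
  source: "web analytics script",
  storedIn: "analytics database",
  retentionDays: 30,
  sharedWith: ["analytics vendor"],
  risks: [
    {
      description: "Indirect identification of a visitor when combined with other logs",
      severity: "medium",
      mitigation: "Truncate the last octet before storage",
    },
  ],
};
```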

It is critical to involve UX design, UX writing, and UI design professionals, among others, both to bring existing interfaces into compliance with data protection laws such as Brazil’s LGPD and the European GDPR, and to design new interfaces in accordance with them, in a way that is comprehensible and transparent to people. Some significant points that need to be addressed are listed below, followed by a sketch of the state such an interface might manage:

  • How and which data (and cookies) are collected
  • How data (and cookies) will be used, stored, and shared
  • What data protection rights people have
  • What the privacy policy is and how it can be changed
  • What options people have to consent to, or refuse, the collection of data (and cookies)
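These points can be surfaced in the interface as granular, revocable choices rather than a single “accept all” button. A small sketch of the state behind such a consent screen, again with hypothetical category names:

```typescript
// A hypothetical sketch of the state behind a cookie/consent interface:
// each purpose is a separate, revocable choice, and refusing is as easy as accepting.
type CookieCategory = "essential" | "analytics" | "advertising" | "personalization";

interface CookiePreferences {
  choices: Record<CookieCategory, boolean>;
  policyVersion: string;               // which privacy policy text the person actually saw
  decidedAt: Date;
}

const rejectAllNonEssential: CookiePreferences = {
  choices: {
    essential: true,                   // strictly necessary cookies need no consent
    analytics: false,
    advertising: false,
    personalization: false,
  },
  policyVersion: "2021-11",
  decidedAt: new Date(),
};
```

Keeping “reject all non-essential” as easy to reach as “accept all” is what makes the choice real rather than coerced.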

How can we rethink the collection, storage, processing, and use of data to be less covert and intrusive, and more transparent? What can be done to guarantee genuine consent and return autonomy to data subjects? How can we intervene in the design of these systems to prevent people from becoming hostages to behavior manipulation? I have more questions than answers, and the dilemmas are many, but I believe that the digital world can exist without surveillance capitalism.

We need to find ways to modify dystopian structures and bring more decentralization, randomness, transparency, unpredictability, freedom, and diversity to the design of digital systems, as well as be part of the solution and lead the creation of networks, services, and products that promote emergent, distributed, and sustainable organic systems for rebalancing the world, aiming to preserve autonomy, freedom and, above all, the human experience.

Reading Recommendation

The Age of Surveillance Capitalism, by Shoshana Zuboff, 2019.

Ten Arguments for Deleting Your Social Media Accounts Right Now, by Jaron Lanier, 2019.

Automating Humanity, by Joe Toscano, 2018.

Weapons of Math Destruction, by Cathy O’Neil, 2016.

Radical Markets: Uprooting Capitalism and Democracy for a Just Society, by Eric A. Posner and E. Glen Weyl, 2019.

Your Rights Matter: data protection and privacy, FRA Fundamental Rights Report 2020.

From Privacy to Profit: achieving positive returns on privacy investments, Cisco Data Privacy Benchmark Study 2020.

Reports

Digital 2021 Report, DataReportal, 2021.

How COVID-19 Has Changed Consumer Behavior on Mobile Forever, App Annie Report 2020.

Jana Velozo
Design Globant

I believe in a world where people matter. So I’ve built my career on one simple principle: to craft meaningful experiences, services & products for people.