Image by Susan Kare from my 2014 talk Balancing our Digital Diets

Authoritarian Technology: Reclaiming Control

Part 2 of 2

Judy Estrin
Sep 11, 2018

--

Our relationship with digital technology is not healthy. It has become increasingly authoritarian and controlling as we wittingly and unwittingly submit. As I describe in Authoritarian Technology: Attention!, we have put technical innovation on a pedestal, with dire consequences. The following ideas are intended to help frame and fuel the debate around mitigating the risks.

As much as we may want quick or simple answers, our internet relationship cannot be fixed with a self-help book. This is not just about technology anymore. To quote Albert Einstein, ‘We cannot solve our problems with the same thinking we used when we created them.’ More technological magic alone is not the answer. A culture of ‘move fast and break things’ that is focused on fixing the last mistake puts us all at risk. We need a more critical, nuanced model for how we develop, use, and moderate new technology, one that looks not just at the pieces but at their interconnections and potential consequences. We should also be mindful of the side effects of any remedies themselves.

While some believe that we as a society do not have the will or capacity to change our current path, we must try. We can vote with our feet and pocketbooks to incentivize tech companies to make changes, and push legislators to move aggressively to stand up for us. The technology is new and the rate of change is accelerating, but we have faced complex challenges before. Industries such as food, tobacco, and carbon-based fuel have all been pressured into making significant improvements.

Where to start?

As with repairing other types of relationships, a framework for action must be multi-dimensional. It should include regulations to adjust incentives and behaviors, as well as government and non-profit funding for ongoing research and education. Six broad areas need work.

We must act now to address known harms by:

  • empowering people with information and tools,
  • protecting the vulnerable,
  • demanding incremental changes from the industry.

At the same time, we must work to change the rules of the relationship by:

  • adjusting the underlying power dynamics to ensure competitive markets,
  • establishing a field of research to provide understanding of the byproducts of current and future technological innovation,
  • increasing the technical capacities of government and civil society.

Empowering and Protecting Humans

We need specific information and tools to protect ourselves, to help us say ‘enough’, and to look up from our phones (metaphorically and physically). New applications that limit use are good, but most still have us relying on an app to tell us what to do. Smartphones and online services may have become an integral part of our lives, but too few people understand their more insidious impacts, much less the priorities of their designers. We need a range of programs teaching digital literacy and safe use — like age-appropriate sex-ed or driver’s education — with courses in schools and independent continuing-education offerings. The curriculum should go beyond teaching coding or using PowerPoint. It should include subjects like why one might want to turn off notifications, etiquette for sharing articles, dealing with online bullying, and basic security practices. It should also cover more complicated issues, such as early signs of addiction and how to discern various types of disinformation.

Content should be informed by independent studies of actual benefits and harms. As with other types of addiction, we need effective intervention and support programs. Armed with the right information, we can make more conscious decisions about the trade-offs of today’s 24/7 connectivity, and parents can better guide their kids’ use of technology. We also need a better understanding of the appropriate level of technology in schools, especially in early education, so that we do not expose our children to long-term harm.

Incremental changes from tech could cede some control back to us

What changes in behavior should we seek from the tech companies? In an ideal virtual reality, they might choose to disrupt themselves: take a short-term shareholder hit, scale down, and slow down. Their systems might be redesigned to prioritize our needs over maximizing today’s growth and profits. We can dream, but in the meantime we should push for incremental progress on three fronts: better industry transparency and accountability; more investment in prevention, detection, and remediation; and changes to some of their current designs.

We need visibility into the types of data gathered, how long it is stored, and who has access. Privacy policies and controls must be designed to truly educate users and encourage them to be more protective, not to obfuscate with the goal of getting people to share more. Maybe these services do not deserve to ‘know’ us so well. The industry should be experimenting with more parsimonious data collection, decentralized models for data ownership, and independent identity services. The jury is still out on the impacts of the European Union’s new General Data Protection Regulation (GDPR), but at a minimum it will give us all a chance to learn what does and does not work.

We need mechanisms to help us view content with a better balance of trust and skepticism. Who or what is generating or sharing the content we see (humans or bots), and who or what is deciding what we see (humans or algorithms)? How are those decisions made? Maybe bots shouldn’t be legally allowed to impersonate humans.

We deserve respect as people, not just users, and products that are not purposefully addictive. Designers should reconsider some of the current givens of digital services. Alternatives for how we navigate the massive amount of content competing for our attention, moving beyond the current algorithmically selected feed, might help us break through our filter bubbles. Wouldn’t we be better off with limitations on micro-targeting and a more nuanced way to assign value and communicate our feelings than today’s ‘like’ buttons? How about putting some resistance back into the system with speed limits or bumps, as is done with programmatic stock trading? Caps on the rate of re-tweets, or on how often a given tweet can be forwarded, might encourage more thought. Instead of pouring fuel on a fire by amplifying a post because it is popular or outrageous, platforms could dampen its spread. What is ‘trending’ shouldn’t consume the next news cycle or bring out the worst in us.
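To make the ‘speed bump’ idea concrete, here is a minimal sketch, in Python, of one way a cap on re-shares within a rolling time window could work. Everything in it (the SharePolicy class, the allow_share method, the specific limits) is my own hypothetical illustration of the mechanism, not any platform’s actual design or API.

```python
import time
from collections import defaultdict, deque
from typing import Optional

class SharePolicy:
    """Hypothetical per-post cap on re-shares within a rolling time window."""

    def __init__(self, max_shares: int = 100, window_seconds: float = 3600.0):
        self.max_shares = max_shares        # cap on re-shares per post per window
        self.window = window_seconds
        self._recent = defaultdict(deque)   # post_id -> timestamps of recent shares

    def allow_share(self, post_id: str, now: Optional[float] = None) -> bool:
        """Return True if this re-share is under the cap; record it if so."""
        now = time.time() if now is None else now
        shares = self._recent[post_id]
        # Expire share events that have aged out of the window.
        while shares and now - shares[0] > self.window:
            shares.popleft()
        if len(shares) >= self.max_shares:
            return False                    # over the cap: add friction, don't amplify
        shares.append(now)
        return True

# Example: the 101st share attempt within the hour gets held back.
policy = SharePolicy(max_shares=100, window_seconds=3600)
results = [policy.allow_share("post-42", now=float(i)) for i in range(101)]
assert results[:100] == [True] * 100 and results[100] is False
```

A real system would face harder questions this sketch sidesteps, such as what to do with shares over the cap: queue them, slow them, or surface them without algorithmic amplification.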

Changing the rules of the game

While we welcome incremental changes and encourage social responsibility initiatives, they are not sufficient. We should be skeptical of promises of change that are made in the interest of maintaining control. We need to consider the possibility that the relationship can’t be fixed with small adjustments. Elements of today’s infrastructure may be just too big to exist safely. What if its algorithmically driven scale and speed make it impossible to control and govern in its current form?

We need to change the rules to rebalance power. Our relationship with technology is only going to get more stressed, and changes more critical, as artificial intelligence becomes the engine that powers everything around us. Talk of legislation and regulation is polarizing; we often move too fast or too slow and cater to too many (or the wrong) agendas. But at this point we have no choice; self-regulation alone is not likely to work. Many leaders want to ‘do good’ (or at least do no harm), but companies are not incentivized to give up power, control, or profits. As with other industries such as big oil, food, or pharma, where the relationships between industry and consumers are complex and at times unhealthy, third-party intervention is required to create long-term change. Calls to just ‘break them up’ may be too simplistic. Would we do this along the lines of reach, scope of services, influence, or level of data aggregation? The time it will take to debate and update our current antitrust models mandates parallel paths of action.

One of the strongest incentives to change is the threat of competition. We need more choice, yet the entrepreneurial ecosystem has been weakened by the size and dominance of today’s tech companies. Their power is protected by proprietary, vertically integrated systems, along with cash positions and market valuations that allow them to absorb innovative start-ups at a very early stage. New companies, like seedlings in a garden, need time to establish themselves and special care to grow. If the founders of Google or Facebook were starting out now, their companies would likely be absorbed into one of the current giants, with many of their new ideas never seeing the light of day. We need to restrict overly dominant players from expanding further into new application spaces, and to determine how to break some of the ‘network effects’ of platforms and marketplaces that lock us in and competitors out.

We should evaluate both new options for legislation and existing laws that might be applicable. How do consumer protection issues around addiction and health compare to tobacco, food safety, food labeling, alcohol sales, or advertising? What can we learn from previous media industry actions relating to ownership limits, advertising, or yellow journalism? Are there aspects of regulation of the complex financial services or health care industries that apply? What might be the responsibilities of an information fiduciary? Do we need a digital protection agency? Perhaps the actual highways could be a model for today’s information highway given the various aspects of safety involved. We have rules of the road and education that protect us from each other, mandatory seat belts and car safety standards when those rules are not enough, and car emission standards to protect the environment.

When designing new regulation, careful consideration should be paid to not favoring established players that can best afford the resources necessary to comply. We must consider not only the relationships between the companies and their users, but also the various interrelationships among the big industry players, smaller businesses, and other disrupted industries (e.g., journalism). And we must not ignore the role that technological innovation plays on the global stage.

We don’t know what we don’t know

Given the pace of change, we can no longer design static, retroactive solutions; we require forward-looking policy and organizations that can adapt more rapidly. We need independent, collaborative resources in a dedicated field of study that evaluates the impacts of technology on individuals, society, and democracy, providing ongoing information about problems and options for mitigation. The funding for these efforts must be transparent. We are suffering today from not having sufficient research on the effects of technological change (and from ignoring the research that does exist). Actions must be informed by data — not tech optimism or panic.

To ensure that we provide checks and balances on the technology industry in a smart way, we must have deeper and more current tech expertise advising the relevant (and any new) agencies, Congress, think tanks, and academics in non-technical fields. This expertise cannot come exclusively with a corporate agenda, either directly or through corporate funding. These professionals must be focused on understanding the implications of tech, not just applying it. They need to be trained to collaborate with other disciplines — including media, legal, political science, and psychology — and not to force human problems into a digital representation of the world. We also need education programs for all engineers that include ethics and systems thinking, so that they consider side effects as healthcare practitioners do.

Reclaiming control

The tech industry has elevated the values of maximum connectivity and convenience — without enough consideration for what we lose. Artificial intelligence systems inherit these values and manipulate basic human needs and desires to exploit us. Failure to acknowledge this exploitation protects those in power. Facing up to these problems is an opportunity for growth and innovation in the private and public sectors.

In healthy relationships, knowledge is used to bring out the best in both parties. Imagine if those in charge of developing digital technology prioritized us by demonstrating understanding and respect for all aspects of human nature. Tomorrow’s systems would be designed with the recognition that our need is for deep connectivity, not expedient connections. They could help us build — not distract from — real human relationships and enhance all of our intellectual and emotional capacities. They would balance connectivity with containment of harmful elements, and would use the knowledge of our vulnerabilities to protect us from ourselves and others. We might then have a relationship that brings out the best of democracy and humanity. Let’s reclaim the concept of ‘network effects’ and apply it to a healthy, human-centered technology movement that distributes more value to all of us.

As with other crusades for change, from green energy to healthy food, it is up to us to become more aware and responsible users and leaders. If we hope to give our children the future they deserve — to address our obesity and diabetes epidemics, the opioid crisis, inequality and climate change — we must be intellectually and emotionally strong. Our future depends on our willingness to act. We must choose the path that protects our democracy and our humanity — imperfections notwithstanding.

--

Judy Estrin

Network technology pioneer, entrepreneur, advisor. CEO, JLabs, LLC (www.jlabsllc.com). Author, Closing the Innovation Gap