Authoritarian Technology: Attention!
Part 1 of 2
I remember when the dream of the Internet and personal computing was about democratization and the distribution of power to foster exploration and creativity. Watching a demo of the first graphical web browser in 1993, I imagined the opportunity of giving a voice to anyone with access to a computer. We did not anticipate the erosion of democracy. Our relationship with digital technology has become increasingly authoritarian and controlling, as we wittingly and unwittingly submit. It is not just the unintended consequences of technological progress that are problematic, but also the ramifications of intended use. Our need for connection and our desire for convenience are being co-opted in the pursuit of technological and corporate dominance.
It is overwhelming to comprehend the far-reaching, toxic aspects of this internet relationship — including economic, consumer safety, public health, and national security threats. We feel the pull to check our phones and read concerning studies about rising anxiety levels among our youth. We see the invasion of privacy, polarization, erosion of freedoms, and undermining of elections and democracy throughout the world. We debate the impacts of artificial intelligence on the future of work. Less obvious are the influences on our intellectual capacities, personal authority, and agency. Platforms meant to connect the world may actually be dividing us and fragmenting the shared context that contributes to a civil society.
Digital technologies are no longer relatively benign, with bugs or design flaws that are ‘merely annoying’. Introducing a new type of virus into our population is appropriately deemed biological terrorism. Deepfake video editing can now completely eliminate the ability to believe what one sees. Yet a BuzzFeed video that puts words in President Obama’s mouth is passed around the internet generating more likes than chills. Perhaps we don’t think about software and information that way because they are not tangible. But the risks to our intellectual, physical, and emotional prosperity are real.
To reclaim control of this relationship, we must acknowledge the complexity and range of impacts, the underlying dynamics of ‘technology capitalism’, and the lens through which we view technological progress. We need short-term incremental action as we work on longer-term structural change. We must be willing to revisit many of our cultural and economic assumptions, including design choices and priorities, the use of technology as co-parent and co-educator, and policy definitions of antitrust, competition, and consumer harm.
As an entrepreneur and advisor, I have been a successful disruptor of legacy solutions and have experienced failure when up against firmly entrenched companies. While CTO of Cisco and a board member at FedEx and Disney, I had the opportunity to work with powerfully dominant corporations in different industries. As have so many, I have personally been a victim of controlling behaviors, sometimes aware of the choices I was making and at times not. I have been through a number of technology and economic cycles. This one feels dangerous — with more ubiquitous and adverse effects for us as individuals and society. Thus, I am raising my voice to help increase awareness of the problems and to contribute ideas to the discussion of solutions in Authoritarian Technology: Reclaiming Control.
The pervasive role of technology in today’s society
Most of the current focus on the nefarious effects of technology is on Facebook, data privacy, and elections. Yet our unhealthy relationship with digital technology goes beyond any one company or service. The combination of multiple new innovations in computing and business practices has created a rich environment — one that is ripe for abuse. Ubiquitous mobile devices with addictive user interfaces drive engagement and demand attention. A shift to ‘peer-to-peer’ based distribution has lulled us into a false sense of trust. News feeds and recommendation engines, powered by aggressive data aggregation and predictive artificial intelligence, influence our feelings and thoughts.
Advertising-driven business models, a winner-take-all culture, and increased consolidation drive the large corporate players. Inspirational and aspirational Silicon Valley success stories have created a new version of the American Dream, of which the overwhelming majority can still only dream. The current ethos around technology has individuals and many political and thought leaders submitting to the belief that we dare not even talk about slowing anything down. Individually and collectively we have put technical innovation on a pedestal, focusing only on the positive and treating each new step as inevitable. The increased interdependence of technological innovation and capitalism has created an imbalance that will become even more problematic as the power and ubiquity of AI grow.
It is no longer enough to succeed. Companies must dominate
It wasn’t always this way. I was fortunate to begin my career in the Silicon Valley of the ’70s as a Stanford graduate student on the team developing the Internet protocol software. I felt the excitement of unlimited possibilities as we exchanged the first packets of information with colleagues in Boston and London. I went on to create companies and products that were instrumental in building the infrastructure enabling today’s digital services.
The entrepreneurial culture was aggressive and competitive, but also collaborative, as the open-systems philosophy behind local area networking, the Internet, the World Wide Web, and Unix drove broad innovation. Reflecting the industry perspective of 1990, Steve Jobs said technology was a ‘bicycle for our minds’; the computer was a ‘remarkable tool’ enabling humans. Tech companies sold primarily to corporate and government enterprises that had the power to negotiate for terms and product capabilities. When large dominant companies used their power unfairly, carefully considered government intervention created opportunities for new players. Antitrust actions forced change at AT&T, IBM, and Microsoft, enabling the PC and Internet industries to take hold.
During the ‘dot-com boom’ of the late 90s, the power dynamic changed as a mainstream consumer market fueled the explosive growth of new platforms and the industry’s influence expanded to all aspects of our lives. There was no collective representation to stand up for consumer rights. The three critical services developed on top of the internet — mobile, social, and cloud computing — were built on proprietary platforms. ‘Walled gardens’ created around data, algorithms, and communities made it hard for users to leave and for competition to break through.
As the tech industry grew in wealth, there was a shift in how its power was wielded. The increasing amount of money to be made changed incentives, and leadership styles became more self-interested. Growing tech economic influence was reflected in headlines such as ‘software is eating the world’ and ‘every company is a tech company’. Celebrity status reinforced arrogant tendencies.
Tech supremacy has become a thing. It is no longer enough to succeed; companies must win and dominate. Priority is placed on locking in value through ‘network effects’ — increasing the value of a product or service as a function of the number of people using it. The traditional definition of the term implies increased value to users. But when network effects are driven by proprietary platforms (vs. open standards) to create defensibility and limit choice, it is the provider that accrues more and more value. The creation of multiple layers of network effects by the largest players has fueled incredible growth. Users, content providers, and marketers all need to use these services because that is where everyone else is — even when they are taken advantage of. Innovation and market alternatives are limited, and profits accrue to the few. Those that started out resisting have become more of ‘the Man’ than their predecessors ever were.
Eliminating friction
Maximizing scale and speed is now integral to the industry’s culture, incentives, and design principles. ‘Gamification’ and ‘growth hacking’ have become coveted skills. The mantra is ‘make things frictionless’. Automate everything. Value the number of connections over the depth of interaction. Employ ever more sophisticated psychological techniques and minimize obstacles to enticing and retaining users. All barriers to ingesting data and to onboarding or distributing content, ads, or marketing campaigns are removed. Lowering barriers does give voice to more people and organizations. But missing are the checks and balances needed to protect us from ourselves and to impede those who would use these tools against us. Anticipating potential problems is hard to automate, so companies rely on detection alone. By the time we become aware of problems, it is usually too late to prevent harm.
The technology industry’s drive toward frictionless connectivity plays into society’s increasing focus on the pursuit of convenience and our fundamental human need to connect — to be known and understood — in order to be safe. We have a natural desire to avoid pain and discomfort; we choose ease of use even when we know the downsides for ourselves and our environment. We give up power, privacy, and agency to make it easier to log in, shop, connect, or decide what to read, watch, buy, or vote on. Quickly we become dependent on features that we didn’t know we needed.
We seem to have collectively forgotten that the right level of friction is a good thing. It is necessary in so many aspects of our lives — from brakes on a bicycle to setting boundaries in personal relationships. Connectivity without containment leads to the uncontrolled spread of wildfires, floods, infectious disease, and misinformation. Rules and norms are the guardrails that enable society and democracy. Some resistance between our darkest thoughts, feelings, and impulses and what we post or tweet enables a civil society and allows for more open debate. Obstacles provide a catalyst for growth. Authentic relationships involve effort to work through problems. In order to connect with our inner lives in a healthy way, we must be willing and able to tolerate uncomfortable feelings without immediately seeking distraction or escape.
A deeper look at what we are losing
Relationships that begin on a level playing field may degrade into dynamics that diminish or stunt growth, or into all-out abuse. Those being dominated often fail to realize that they are slowly losing ground. Positive and negative reinforcement, emotional abuse, isolation, and disinformation are some of the many tactics that authoritarians use to paint an alt-reality and mold their subjects. Today’s technology is seductive. Digital services that are so integral to how we engage socially, professionally, and politically make us feel well connected and informed. It is all too easy to ignore or dismiss the costs to our sense of reality, trust, and emotional well-being.
What began as social networking, allowing us to share our lives with family and friends, has evolved into social media, social commerce, and more — disrupting industries from journalism to retail shopping. As we quickly scroll through our feeds of images, memes, and attention-grabbing headlines, the lines between fact and fiction are blurred. Information, entertainment, marketing/commerce, and community engagement are curated for us, intermingled, and pushed to a small screen that we are rarely without. Algorithms beyond our understanding or control define our view of the world through our phones. Reality has become dangerously fluid and elusive as we no longer all agree that facts exist or matter.
The ability to develop trust over time is a key aspect of healthy human relationships. Demands of absolute loyalty and abuse of trust are common mechanisms to achieve control. We blindly trust digital platforms, yet the decision processes and motivations behind them are opaque. We hoped that the shift away from institutional brands toward social distribution through ‘friends’ would build trust. Instead it is undermining trust, as we multitask and quickly like or share without considered thought. Receiving information from someone we trust does not mean that the information can be trusted. Democracy depends on an informed public.
Authoritarian leaders, bosses, or partners use knowledge of sensitivities to control and cash in on their ‘subjects’. Many tech designs exploit human nature to dehumanize and degrade us. Incessant notifications, scrolling timelines, and brightly colored games hijack our brain chemistry, much in the same way a narcotic or sugar does, ensuring our maximum attention and use. With ‘addictive’ technologies, casual play of Candy Crush or Fortnite, or viewing of Instagram stories, can evolve into excessive self-medication and escape, numbing to an extent that results in loss of agency.
Fear and anxiety make us more susceptible. We crave relief. Yet watching YouTube videos or reading tweets may actually leave us feeling less safe. Our teens suffer from ‘FOMO’ (Fear Of Missing Out) while paranoia and reactionism are stoked by ‘FOLESS’ (Fear Of Losing Economic or Social Status). ‘Going viral’ as a measure of importance has taken over our editorial systems, online and off. Urgent notifications and breaking news overload our systems. What people like has replaced what actually matters. In an increasingly frenetic world we all strive to be known, yelling louder to be heard.
Not only are we seeing a significant increase in physical myopia from staring at our screens, but we are also becoming more generally ‘nearsighted’ through constant reinforcement of instant gratification. Our attention spans have shortened, and our appetite has diminished for anything more complicated than a tweet or emoji. Yet the world is ever more complex. Addressing grand challenges, from inequality to climate change, depends on us paying attention over a sustained period of time and embracing complexity, not just simple answers. Solutions require us to be willing and able to choose delayed benefit for many over short-term convenience for ourselves. What happens if, over the next few generations, we lose the capacity for intentionality and for facing difficulty?
I have heard of young children issuing commands to a friend as if the friend were an Alexa device. I worry that we are the ones adapting to the limitations of the digital domain. A key early developmental milestone is cooperative play. Yet our phones, audio headsets, and VR devices seem to take us backward as we isolate ourselves, each in our own world. As we rely on medical or fitness devices for data about our health, do we train ourselves to focus on just the numbers and lose perspective of the overall picture? This input can be valuable and even life-saving. But we become too data-driven if we stop paying attention to our real feelings and instincts, assuming that measurements will tell us how we should feel and think about our personal progress. As our personal authority declines, we become more open and available to various types of authoritarian leaders in our political, professional, social, and religious lives.
Data is power. AI is the weapon
Money is often used to dominate. Data is the currency in our relationship with digital technology. Of concern is not only the level of raw data collection, but also the aggregation of data from various sources. Websites visited, purchases, places frequented, and verbal commands to play music, call a colleague, or watch a program are all combined with likes and searches to provide a view into our private lives. The relationships between the companies that make money by gathering data, trading it, or using it are often unclear.
We should indeed care about the loss of data privacy — whether our data is safely guarded against break-ins or shared with third parties. We are exposed to overzealous surveillance, reputational damage, and identity theft. But the uses by advertisers and the services themselves are also dangerous. Ever more sophisticated capabilities persuade us to ‘willingly’ give up more data without understanding that the flip side of personalization is micro-targeting — a technique used to improve the effectiveness of online advertising. But the targeting is not limited to selling us products. Its increasing use by political actors fragments our discourse, and bad actors are enabled to interfere with elections.
Mass manipulation of people’s emotions at relatively low cost has gone beyond irritating marketing campaigns to a level of disinformation and polarization that erodes democracy and society. Through well-connected nodes in private and public communities, conspiracy theories, anger, and hate jump from fringe groups to the mainstream — and at times onto the streets. It is not just paid advertising that is problematic; these platforms encourage abuse. The design of standard features such as liking and sharing taps into the worst aspects of our nature, and we become complicit in turning up the heat.
Today’s AI-driven curation and filtering systems are another form of data-driven targeting, with more insidious effects on our intellect and psyche. ‘Personalized’ recommendations are based on predictive models of how we fit into society — actually reinforcing homogeneity over individuality. Instead of knowing ourselves, we unconsciously reveal our feelings and then look online to figure out who we are and what we believe. Recommendation engines that ‘autoplay’ incrementally extreme content to keep us engaged are used to radicalize the most vulnerable.
While consuming information and entertainment or buying products, we pick from a curated selection and no longer widely browse and explore. We give up the learning, sense of surprise, or awe that can come from experiencing something truly new. Today’s services not only provide us with all of the answers; autofill and prompting frame our questions as well. ‘Filter bubbles’ tell us what we want to hear, undermining the shared frame of reference through which we interpret the world. Subjects under authoritarian rule are often kept in line through isolation, which limits their ability to question and branch out. As we increasingly rely on our phones, we limit the growth of our minds.
Reversing the cycle
Resolving an unhealthy relationship begins with increased awareness, standing up for what matters, and demanding change. There is no single fix. The pervasive influence of these technologies, combined with their benefits and complexities, does not allow for simple remedies. The accelerating pace of technological change necessitates a different model for evaluating new technology. Authoritarian Technology: Reclaiming Control is aimed at helping to move the discussion forward.
It is time to take a hard look at what we value. Unfortunately, we cannot take respect for our rights, needs, and personal agency for granted. We must act now to stop the degradation of our minds and our society. These ‘tools’, which are only getting more intelligent and powerful, must not become ever more controlling. We can break this cycle of control for ourselves and future generations if we are willing to apply significant resistance and sacrifice some speed for intentionality. Unbridled technology capitalism is threatening democracy and our humanity.