Omidyar Network
Mar 11, 2020

Image Credit: Lili des Bellons

By Erica Orange & Jared Weiner, The Future Hunters, and Eshanthi Ranasinghe, Exploration & Future Sensing, Omidyar Network

War and conflict, and the threat of both, remain realities of daily life for people in much of the world. However, research indicates that global deaths from war and related violence have long been in decline, a trend that historians and futurists alike point to when discussing how the human condition is improving. Much of this can be attributed to the changing nature of warfare itself. With advanced technology, modern global warfare is as much characterized by the threats of advanced weaponry and terrorism as it is by actual human combat. But, as we look deeper into the changing nature of warfare, we see something else happening. Warfare, including cyberwarfare, is increasingly digital and autonomous, online, distributed, and subversive. While deaths decline, the broader human impact rises. The theaters of war are changing, and the barriers to participation are dropping.

The future of warfare will increasingly be influenced by AI. Many applications remain theoretical, due to technological limitations and the ethical considerations that cast a shadow over the widespread adoption of these technologies. However, many others are well in development, and some are already having an impact. Combat will be increasingly populated by specialized drones and robots that are either fully autonomous or operated remotely by human pilots. And, within this context, AI itself may soon decide who lives or dies. We may be fast approaching a future characterized by what some might call “algorithmic warfare.” The US Army wants to build smart, cannon-fired missiles that will use AI to select their targets, beyond the reach of human oversight. The project is called Cannon-Delivered Area Effects Munition (C-DAEM). Unlike laser-guided weapons, which rely on human operators, C-DAEM will find targets for itself. Currently, these types of robotic and autonomous applications are more easily deployed via air-based weapons systems than within what remains a far more complex ground-based combat environment. We are also rapidly entering the era of unmanned naval warfare.

The US is not the only country developing these types of smart missiles and bombs. Russia is at the forefront of the race toward AI-guided missile technology; Israel is developing the Spice-250 bomb; China has developed lethal autonomous drones; and the European firm MBDA has developed a “fire-and-forget” missile, the Meteor, to be deployed by several countries’ armed forces. Many groups demand that such weapons be banned, including the Campaign to Stop Killer Robots, a coalition of NGOs that wants the use of force controlled solely by humans. But, going beyond conventional warfare, there is perhaps no combat theater more directly influenced by AI today than cyberwarfare. The last three years have seen a rise in automated bot attacks, including the first AI-powered cyberattack, detected in India in late 2017.

“Machine behavior” is a new field of scientific inquiry designed to study how artificial agents interact “in the wild” with humans, their environments, and each other. It was initiated by Iyad Rahwan, who directs the Center for Humans and Machines at the Max Planck Institute for Human Development, with the help of colleagues from robotics, computer science, sociology, cognitive psychology, evolutionary biology, AI, anthropology, and economics. The field has not existed long enough to collect the kind of longitudinal data necessary to draw firm conclusions. But it is well on its way, and just in time: autonomous systems are affecting more aspects of our lives than ever before, and the “behavioral” outcomes of these systems are proving hard to forecast by simply examining the underlying code or engineering. It will be important to see how advances in this nascent field influence the future of AI-driven warfare.

One of the ways in which warfare has changed most profoundly is the systematic blurring of human accountability and personal risk. If someone is remotely operating a drone rather than flying a plane in combat, how might their calculations of risk or ethical decision-making change? If a partially or fully automated technology malfunctions, who bears the responsibility? The rules of engagement change dramatically.

Traditional war is expensive, politically unpopular, risky, and rarely effective. But competition over land and limited resources, and the desire for security, global influence, and economic dominance, have not diminished; if anything, they continue to rise. States have been moving toward more subversive, online forms of warfare in pursuit of their objectives, via mainstream social media and sharp power tactics, manipulation of digital systems and smart architecture, and the dark web. ISIS was the first terrorist group to hold both physical and digital territory, and it served as a harbinger of things to come: future prominent terrorist groups may be likelier to have extensive digital operations than to control physical ground. ISIS operates as a pyramid of four levels of digital fighters, all part of a broader social media and disinformation strategy designed to spread propaganda and recruit new members.

The World Economic Forum ranks massive digital misinformation, or “digital wildfires,” as one of the world’s greatest geopolitical risks. And DARPA has developed its Social Media in Strategic Communication program. According to a report by researchers at Oxford University, the number of countries engaging in social media manipulation more than doubled over the last two years, to 70. Foreign meddling operations have thus far largely been the purview of state actors and their proxies, but other actors will gain similar capabilities in the near future as AI lowers the barriers to entry.

As information warfare expert Brett Horvath shared with us, “This creates an environment where instances of social and economic volatility are increasingly enticing openings for bad actors with an agenda. That could be a rival faction in an authoritarian regime, domestic or international economic interests, or a foreign nation-state. As volatility attracts speculators to financial markets, social instability may be a magnet for a range of bad actors, armed with weaponized technology and an incentive to make a dangerous situation worse.”

But these efforts in engineered volatility may not be the most dangerous outcome. As Horvath goes on to say, “Perhaps one of the greatest risks is that neither the actors behind these efforts, nor independent researchers, have a scientifically reliable way to model the non-linear second- and third-order risks that result from constantly escalating influence operations. I worry that unless we move beyond simple detection and attribution of misinformation, and toward something that accurately models emergent risk, actors like Russia, China, and the US may not be able to understand the information warfare equivalent of ‘Mutually Assured Destruction’ before it’s too late.”

One of the easiest ways for non-state actors to manipulate public opinion will be through the use of increasingly sophisticated “deepfakes”: highly realistic and difficult-to-detect digital manipulations of audio or video.

Researchers have long been moving toward a future where human soldiers with enhanced abilities (e.g., exoskeleton suits) operate more like real-life superheroes than conventional soldiers. The Pentagon is creating technology to build the “super soldier” of the future through DARPA’s new Squad X Core Technologies (SXCT) program, which could equip soldiers with unprecedented awareness, adaptability, and flexibility. Military robots inspired by various animals will also soon work alongside South Korea’s human soldiers: South Korea’s Defense Acquisition Program Administration (DAPA) plans to incorporate “biomimetic” equipment into military operations by 2024.

US Big Tech: The Pentagon has long been trying to establish supremacy in the race toward advanced, autonomous, and AI-driven defense capabilities. To do so, it enlists the help of the world’s leading technological minds, aided by the tech industry’s ability to poach talent from the defense and intelligence world. One of the most well-publicized examples is Google’s Project Maven contract, under which the company helped intelligence analysts identify military targets from video footage. Google withdrew from the project after employees protested the company’s involvement; it remains both a harbinger of the capabilities unleashed by these types of public-private partnerships and a warning about the internal employee unrest they can trigger. In late 2019, Amazon CEO Jeff Bezos offered a pessimistic outlook for the US if tech companies avoid working with the Pentagon. He, along with many others, believes that US big tech’s partnership with defense groups is the more humane path: “We are the good guys.” His comments came as Amazon was competing with Microsoft for the Pentagon’s JEDI cloud-computing contract.

Perhaps no tech firm is as intertwined with the US government’s defense function as Palantir. For years, Palantir has provided advanced, and controversial, data analytics and surveillance services to the US intelligence community and the Department of Defense. The partnership is equal parts lucrative and secretive, and it raises myriad ethical questions about how data is collected and leveraged.

Global Private Sector Tech: The conversation around big tech’s role in warfare is rapidly becoming more important, and more complicated. The complications grow murkier still as this pattern plays out around the world. Israeli spy tech is some of the best in the world and has been used by at least 130 countries to invasively track and surveil external enemies and domestic activists. Companies like NSO Group, Verint Systems, Circles Technologies, Elbit Systems, and Fifth Dimension specialize in snooping on smartphones and laptops and in infiltrating social media tools like WhatsApp and Facebook; they have sold systems to Bahrain, South Sudan, the United Arab Emirates (UAE), Nigeria, Botswana, Mexico, Azerbaijan, India, and more. Israeli tech has also been used to disrupt elections.

In China, there is little barrier between tech companies and the government. As Chinese private-sector tech quickly catches up to US tech dominance (and in fact already leads in areas like drones, 5G cellular networks, and more), we can assume that its technology is available and accessible to the government as well. To complicate matters further, Chinese tech companies may also collaborate with foreign firms; many US tech companies have had ties to Chinese tech companies that, in turn, have ties to Beijing.

How will exports of warfare tech evolve with global political dynamics? Is every state actor only as strong as the tech companies it can partner with and buy from? Small countries with limited military forces but deep pockets (or access to deep pockets) can be just as powerful and threatening as large ones. What do business, patriotism, and security mean in this new world? How do globalized business models and supply chains adapt to nationalist governments, and what are the implications?

One of the most important considerations when examining the future of warfare is how these emerging autonomous and gamified combat technologies and cyberwarfare tactics will become less prohibitively expensive and more accessible than conventional weapons. The barriers to entry will be far lower for third-party, rogue nation-state, or non-state actors (e.g., guerrilla groups, terrorists, hackers, or other groups or individuals that sit outside our known parameters of public-sector, private-sector, and multilateral entities). Individual hackers with insidious goals could theoretically do as much damage to large governments, institutions, and companies as any other actor. The future is one of cyber-insecurity, in that no technological system, however sophisticated, can be fully safeguarded from outside threats. That said, prominent “hacktivist” activity, best epitomized by the notorious, decentralized, and global Anonymous hacker collective, has been in steady decline over the last several years, due in large part to more sophisticated cybersecurity countermeasures and a fracturing of the group from within. Will security tactics be able to improve at pace with warfare tech?

A recent drone attack in Saudi Arabia targeted Abqaiq, the world’s largest oil processing facility, and the Khurais oil field, which produces around 1 million barrels of crude oil a day. The Iran-backed Houthi group in Yemen claimed responsibility; Houthi rebels have used increasingly sophisticated drones in several recent attacks. In addition, the Russian company Kalashnikov has developed a suicide drone that may revolutionize war by making sophisticated drone warfare technology widely and cheaply available.

There is also an enforcement vacuum that helps create the fertile conditions that allow such rogue actors to thrive. In conflict zones around the world, 1.5 billion people live under the threat of violence. UN peacekeepers comprise the second-largest military force deployed abroad, after the US military. But they are often given few resources with which to achieve their objectives.

The Future of Wargaming “Talent”: What will the future pipeline of wargaming talent be, from a human perspective? If war is increasingly fought digitally, will that change who fights the wars? Will it mean more female soldiers? Older soldiers? Physically disabled soldiers? Will it allow smart military minds who are incapable of conventional physical fighting (e.g., amputees) to return to the battlefield and still engage in “combat”? And what new skills will people have to learn? It is wholly conceivable that teenagers skilled in advanced videogaming (both gameplay and design) will become compelling candidates for recruitment. What would a draft look like in the future?

The Future of Diplomacy: In a world where technology is blurring the lines between what is real and what is fake, what will diplomacy look like? It is already difficult to distinguish between authentic and forged images and videos of the world’s most prominent leaders, and the enabling technologies are still fairly rudimentary. What happens when they advance even further? Will formal diplomacy as we know it continue to morph into something enacted through other technological platforms (e.g., Twitter)? Will we have to develop entirely new protocols to safeguard diplomatic communications? Trust has become a luxury; how will we optimize it in the future?

Peace in a New Decade: The most pragmatic justification for weaponizing all of the technologies described here will be their ability to make combat more efficient, inexpensive, and precise. Beyond that, the public’s future perception of these technologies will be driven by one key metric: Will they make the world safer, and ultimately decrease the toll on human lives? But more questions remain. Will these technologies mean fewer people employed in traditional military capacities? Who is ultimately responsible or liable for the decisions that fully or semi-autonomous weapons make? If the center of control is taken away from humans, does the impact of war become less personal, and thus more justifiable? We might see combat start to take on the features of a videogame. Conflict might also become more unpredictable if it is increasingly enacted by rogue or third-party actors, through means like cyberwarfare, terrorism, or information warfare. How do we enable peace in a new decade? When we look at this trend in isolation, it is terrifying to imagine its escalation, and yet we know trends are rarely that simple: counterreactions emerge as trends play out, shaping what occurs. The end of this story is not yet written. We can still actively shape it.

This is Trend #5 of 5 in Omidyar Network’s Exploration and Future Sensing 2020 Trends to Watch. View the full series here.

Consider Explorations an open space for discussion. We welcome new perspectives — especially those rarely heard, contradictory, relevant, and tangential — and most of all, conversation and partnership to build the future we want, one that includes and empowers us all.

Special thanks to Brett Horvath, information warfare expert and founder of Guardians.ai, for his contributions to this brief.

Omidyar Network is a social change venture that reimagines critical systems, and the ideas that govern them, to build more inclusive and equitable societies.