How dashboards could destroy world peace — and how to prevent it

Martin Waehlisch
Published in Futuring Peace
7 min read · Mar 20, 2021
Illustration by Mario Wagner for UN DPPA

Centuries ago, before gasoline-powered automobiles took over the roads, “dashboards” were mounted at the front of horse-drawn carriages to prevent mud and rocks from being splashed (“dashed”) onto drivers and their passengers. They were later replaced by sophisticated control panels in cars, and a new era of instrumentation ingrained technological innovation in everyday life. Today, dashboards are omnipresent. They help us program our washing machines, assist us in directing the Internet of Things, and even support humanity in bringing robots to Mars. But like all blessings of technology, they can quickly turn into a curse. This is especially the case when the dashboard itself becomes an object of admiration and is seen as a solution to everything. A dashboard needs to be seen for what it is: just one tool, among others, that can help navigate complex problems.

A pioneer in the field of human-computer interaction, Lucy Suchman, Professor Emerita of Anthropology of Science and Technology in the Department of Sociology at Lancaster University, has been investigating ethical robotics and the techno-cultures of humanlike machines since the early 1980s. As a member of the International Committee for Robot Arms Control, she has critically assessed the ethical and social dimensions of the destructive power of lethal autonomous weapons. The following recorded conversation with her was an attempt to shed light on humanity’s complex fascination with technology. We talked about why dashboards could destroy world peace, the phenomenon of tech fetishism, and the impossibilities of Artificial Intelligence.

Control panel for live interpretations in the United Nations, 1953 (UN Photo)

Martin: Lucy, we are having this chat remotely facilitated by computers and are dependent on the ‘Gods of the Internet’ for a good connection while we are staring at our screens. Is it too late for us to save the world from being engulfed by technology?

Lucy: Well, we live in a time that is fixated on techno-solutionism. There seems to be a perpetual fantasy of technologies that will resolve all the really difficult and messy issues that we struggle with as humans. So, we need to be mindful that there are limits on what technologies can do, and vigilant about our dependency on technology over other forms of relationship.

Dashboard of the 1978 Citroen Visa (Reddit)

Martin: Wait, you don’t think new technologies can help us gain world peace?

Lucy: One thing that’s come up in our conversation is what we might call a growing dashboard fetishism. This is a figure of the dashboard, or control panel, that glorifies it as the perfect solution. It’s very much a fantasy of command and control: one can sit in one place and have access to, and make decisions that affect, multiple distant places. Ministries of Defense, not least the US Defense Department, are dreaming of a fully integrated system that will replace so-called legacy technologies. But the truth is that nobody can have total control and agency. The belief in a scenario of perfect intelligence, which is also ‘actionable’, is an illusion.

Control room of Chernobyl nuclear power plant (Getty)

Martin: What do you think is the greatest challenge of focusing too much on dashboards?

Lucy: There’s nothing intrinsically good or bad about dashboards as devices. It’s the assumptions about what they can do and what they take the place of that are problematic. Dashboards seem to be very much about centralized control and agency. But most issues in society are decentralized, and they are about human connection. A dashboard is only going to be valid and useful to the extent that it is integrated into a set of relations and practices that have some connection to the things in the world being represented. What I worry about is that dashboards can actually contribute to disconnection, particularly when we rely on the technology without remembering its limits.

U.S. Navy submarine-launched nuclear ballistic missile system (Lockheed Martin)

Martin: For me, a positive example is that dashboards can help us track, for instance, development spending, giving us some sense of international solidarity and indirectly pointing to areas of state fragility. So, this is good. When you say limits of dashboards and technology, what do you mean concretely?

Lucy: The dashboard is a multiple object. It’s not a single thing, especially when it is about data collection and analysis. A dashboard can shape how we see things and, if we lack contextual expertise, it might lead us to overly simplified conclusions. As an example, in your field of analytical work for conflict prevention and peacemaking, when technologies become part of the process, they might undermine the value of difficult but essential labor-intensive work in the field. It’s not that efficiencies are a bad thing; it’s only if they take the place of thorough investigation and deliberation that they become problematic.

Star Trek Library Computer Access and Retrieval System (LCARS) (Hollywood Relics)

Martin: Understood, and point well taken. Does anything come to mind where you had ethical concerns about the use of new technologies in the context of war and peace?

Lucy: Well, the idea of the Pentagon’s Project Maven and related projects in the US Defense Department is that instead of having anyone on the ground, you could set up a total surveillance infrastructure. But even if we set up this perfect surveillance infrastructure, we still need to actually figure out what to do with all that data. Maven’s promoters imagine that you have full-time video feeds from potential conflict zones, while AI and machine learning come in to process everything. But in my view, this is magical thinking.

Metropolitan Transportation Authority (MTA), Communications-Based Train Control (CBTC), New York City (MTA YouTube)

Martin: What do you think about social media as part of the data wave, and our ability to surf the web while making sense of all the information available?

Lucy: Twitter definitely is a good example of a double-edged sword. I’m resisting: I don’t tweet and I’m not on Facebook. And I have the luxury of being able to avoid using them and still feel connected. There is clearly an absence of governance in this space, and we have seen governance crises in the real world in response to this. Moving from press releases to tweets makes a lot of sense in some ways, but it also seems really worrying. I am concerned about the process by which information gets validated, and that’s hard with speed. The greater the speed, the less opportunity there is for validation. I think temporality is absolutely crucial with respect to information, and that’s another problem with dashboards and, more generally, tech fetishism. We often hear an equation of speed with efficiency, but that is frequently a false economy, because speed generates noise, which then has to be repaired. The idea that speed is necessary can be OK, even a value. But we need caution and care around communication.

Mars mission “Perseverance” launch at Jet Propulsion Laboratory in Pasadena, California (NASA)

Martin: On another issue, an often-raised problem is that technology has discriminatory effects. What are your thoughts on this?

Lucy: One aspect of this is ensuring that the devices you’re using are actually accessible to those they’re intended to reach, so you don’t introduce a new layer of discrimination while you’re trying to remedy other forms of exclusion. If we are talking about inclusion and communications technologies, it’s really a question of access and familiarity. In your area there is clearly the issue of language and translation. In the case of natural language processing, technology can be an enabling capability that is key for inclusion. I am currently working on an academic article where my co-authors are writing in Spanish, and we use translation software that’s tremendously helpful and opens up incredible new opportunities for collaboration. But this is part of a wider question of barriers to participation in all parts of life, even as technology becomes increasingly central.

Martin: Thanks so much for this conversation.

Prof. Lucy Suchman works at the intersections of anthropology and the field of feminist science and technology studies, focusing on cultural imaginaries and material practices of technology design. Her current research extends her longstanding critical engagement with the fields of artificial intelligence and human-computer interaction to the domain of contemporary militarism. She is concerned with the question of whose bodies are incorporated into military systems, how, and with what consequences for social justice and the possibility of a less violent world.

“Futuring Peace” is an online magazine published by the Innovation Cell of the United Nations Department of Political and Peacebuilding Affairs (UN DPPA). We explore cross-cutting approaches to conflict prevention, peacemaking and peacebuilding for a more peaceful future worldwide.


@UNDPPA #UN #FuturingPeace #ConflictPrevention #PeaceMediation #Diplomacy #Innovation