This post is a summary of the book Overcomplicated by Samuel Arbesman, covering the Introduction chapter.
On July 8, 2015, United Airlines suffered a computer problem and grounded its planes. That same day, the New York Stock Exchange halted trading when its system stopped working properly. The Wall Street Journal’s website went down. People went out of their minds. No one knew what was going on. Twitter was bedlam as people speculated about cyberattacks from such sources as China and Anonymous.
But these events don’t seem to have been the result of a coordinated cyberattack. The culprit appears more likely to have been a lot of buggy software that no one fully grasped. As one security expert stated in response to that day’s events, “These are incredibly complicated systems. There are lots and lots of failure modes that are not thoroughly understood”. This is an understated way of saying that we simply have no idea of the huge number of ways that these incredibly complex technologies can go wrong.
Our technologies, from websites and trading systems to urban infrastructure, scientific models, and even the supply chains and logistics that power large businesses, have become hopelessly interconnected and overcomplicated, such that in many cases even those who build and maintain them on a daily basis can’t fully understand them any longer.
In his book The Ingenuity Gap, professor Thomas Homer-Dixon describes a visit he made in 1977 to the particle accelerator in Strasbourg, France.
When he asked one of the scientists affiliated with the facility if there was someone who understood the complexity of the entire machine, he was told that “no one understands this machine completely”. Homer-Dixon recalls feeling discomfort at this answer, and so should we. Since then, particle accelerators, as well as pretty much everything else we build, have only increased in sophistication.
Today’s technological complexity has reached a tipping point. The arrival of the computer has introduced a certain amount of radical novelty to our situation, to use the term of the computer scientist Edsger Dijkstra.
Computer hardware and software are much more complex than anything that came before them, with millions of lines of code in a single program and microchips engineered down to microscopic scale. As computing has become embedded in everything from our automobiles and our telephones to our financial markets, technological complexity has eclipsed our ability to comprehend it.
In recent years, scientists have begun to recognize the inextricable way that technology and nature have become intertwined. Geologists who study the Earth’s rock layers are asking whether there is enough evidence to formally name our current period the Anthropocene, the Epoch of Humanity.
Formal title or not, the relationship between our human-made systems and the natural world means that each of our actions has even more unexpected ramifications than ever before, rippling not just to every corner of our infrastructure but to every corner of the planet, and sometimes beyond.
The totality of our technology and infrastructure is becoming the equivalent of an enormously complicated vascular system, both physical and digital, that pulls in the Earth’s raw materials and emits roads, skyscrapers, large populations, and chemical effluent. Our technological realm has accelerated the metabolism of the Earth and done so in an extraordinarily complicated dance of materials, even changing the glow of the planet’s surface.
We are of two minds about all this complexity. On the one hand, we built these incredibly complicated systems, and that’s something to be proud of. They might not work as expected all the time, but they are phenomenally intricate edifices.
On the other hand, almost everything we do in the technological realm seems to lead us away from elegance and understandability, and towards impenetrable complexity and unexpectedness.
We already see hints of the endpoint toward which we are hurtling: a world where nearly self-contained technological ecosystems operate outside of human knowledge and understanding.
As a journal article in Scientific Reports in September 2013 put it, there is a completely “new machine ecology beyond human response time”, and this paper was talking only about the financial world. Stock market machines interact with one another in rich ways, essentially as algorithms trading among themselves, with humans on the sidelines.
I’ve noticed that, when faced with such massive complexity, we tend to respond at one of two extremes: either with fear in the face of the unknown or with a reverential and unquestioning approach to technology.
Fear is a natural response, given how often we are confronted with articles on such topics as the threat of killer machines, the dawn of super-intelligent computers with powers far beyond our ken, or the question of whether we can program self-driving cars to avoid hitting jaywalkers. These technologies are so complex that even the experts don’t completely understand them, and they also happen to be quite formidable.
Even if we aren’t afraid of our technological systems, many of us still maintain an attitude of wariness and distaste toward the algorithms and technologies that surround us, particularly when we are confronted with their phenomenal power.
We see this in our responses to the inscrutable recommendations of an Amazon or Netflix, or our annoyance with autocorrect’s foibles. Many of us even rail at the choices an application makes when it tells us the “best” route from one location to another. This phenomenon of “algorithm aversion” hints at a sentiment many of us share, which appears to be a lower-intensity version of technological fear.
On the other hand, some of us veer to the opposite extreme: an undue veneration of our technology. When something is so complicated that its behavior feels magical, we end up resorting to the terminology and solemnity of religion. When we delight at Google’s brain and its anticipation of our needs and queries, when we delicately caress the newest Apple gadget, or when we visit a massive data center and it stirs something in the heart similar to stepping into a cathedral, we are trending toward this reverence.
However, neither of these responses, whether from experts or laypeople, is good or productive. One leaves us with crippling fear and the other with worshipful awe of systems that are far from meriting unquestioning wonder. Both prevent us from confronting our technologies as they are.
What we need instead is a different orientation, one that requires us to meet our technologies halfway by cultivating comfort with these systems despite never completely understanding them.
This is the sort of humble comfort that dwells in ambiguity and imperfection, yet constantly strives to understand more, bit by bit. As we will see, this orientation involves, among other things, each of us thinking the way scientists do when examining the massive complexity of biology.
Despite all the overcomplication of the systems we vitally depend on, I’m ultimately hopeful that humanity can handle what we have built.