Strategic Blindness: When Aliens Attack Part II

Adam Elkus
Rethinking Security
Oct 12, 2015
It probably doesn’t want to take a selfie with you, that’s for sure

I have always been inspired by Sam Liles’ post examining the cyber-blindness of typical strategic theories and authorities through a notional scenario involving aliens. That post was memorably titled “Strategic Blindness: When Aliens Attack.” So I will make my own contribution to the genre, albeit with the alien references I like the most (including the source of the header image).

Strategic Blindness: When Aliens Attack

I have not gone sci-fi merely to show off my sci-fi movie (or, more accurately, anime DVD) collection. Rather, I want to point out a basic problem inherent in strategic thinking. We face a problem arising from the contradiction between the following two propositions:

  1. The enemy is like us to some extent; we have things in common. Perhaps we may, despite our own pride, be creatures describable by the same underlying logic and purpose. Even when the enemy is a complete mystery to us, we may infer something about their behavior simply from observation.
  2. The enemy is not like us; what matters is not commonalities but differences. We have, at best, a crude idea of what the enemy will do, knowing little to nothing about the process that forged the opponent. Our inferences about their future behavior are clumsy, haphazard, and may be totally wrong. Similarities may serve only to confuse and mislead us more deeply.

Strategic blindness stems from the most fundamental difficulty a strategist can face: the unknowable and inaccessible mind of the opponent. When the strategist uses stereotypes and exemplars to aggregate a composite entity (Russia into “Putin”), this difficulty is compounded. But let’s go beyond generalities and take a look at some specifics.

Imagine that you are Ellen Ripley, the heroine of the science fiction horror thriller Alien. You are struggling to survive while a “xenomorph” (the term used for the titular aliens of the series) chases you through a ship in deep space. Your co-workers on that ship are dead or will soon be dead. You can only survive by predicting the behavior of the xenomorph. But how could you? Both terrifying and majestic, it seems beyond understanding. Forged by an evolutionary process dramatically different from the one that created you, it nonetheless shares with you the desire and capability to reproduce. You also know that it seeks to destroy you, either because you got in the way of its plan or as a function of its plan (which uses you as an unwilling surrogate for its child). But beyond these basic facts, the creature is a black box.

Next, imagine you are one of the countless sci-fi heroes faced with combating a group of xenomorphs possessing a chain of command, strategic and tactical behaviors, and a hive mind. This now-cliché setup began with Starship Troopers and has been continued by Ender’s Game and the Starcraft video game series (and, to a lesser extent, the shitty Gundam 00 movie). Here you face a cruel extension of Ripley’s problem. You are not just fighting a single xenomorph or even a single xenomorph bug “queen.” You are fighting a sophisticated civilization that you must also treat as a complete black box.

Now, let us take a different science fiction franchise: the Japanese anime Neon Genesis Evangelion. You are Shinji Ikari, forced to pilot a giant robot against your will to fight off a similar crew of inscrutable aliens. Why are they attacking us? But the killer aliens are only the start of your trouble. You are painfully shy and cannot understand or relate to others. You seem to understand some basic things about them: they have roles and duties like you (piloting giant robots, acting as rear-echelon staff to support and direct the giant robot-ing, going to school in your class, etc.). They seem, like you, to be plagued by the same basic problem: needing to be close to other people while dealing with the unavoidable consequence of opening themselves up to being hurt.

You later learn that the aliens you are fighting share 99.89% of your DNA and might be regarded as “humans who cast aside human form.” You place an enormous amount of trust in a new friend only to learn that this friend is actually an alien in disguise. You end up making a desperate last stand not against these aliens but against other humans who seek to kill you in service of their own fear and greed. And on top of it all, the giant robot you pilot (like the enemy robots you are fighting) has been reverse-engineered from the very first alien that attacked the Earth. Kind of crazy, huh?

Strategic Blindness: When You Came To Use The Minimax Algorithm and Chew Bubblegum and You’re All Out of Bubblegum

You will never completely understand the opponent, but simply in order to fight it you must employ some default theory of what makes it tick and what drives it. This default theory is, by necessity, crude. Ripley’s alien wants to reproduce itself; therefore it will hunt down humans and entomb them prior to their violent death via “pregnancy” and “birth.” Good enough? Perhaps, but you still know nothing about the evolutionary process that produced this alien, and the dramatic tension in Alien comes from many of Ripley’s colleagues being horrifically wrong about how best to attack it.

In Starcraft, the United Earth Directorate’s expeditionary force (sent to contain unruly aliens) is totally annihilated as a consequence of strategic blindness, and its commander Gerard DuGalle is marvelously frank about why he failed (if you are interested in the specifics, see the summary in the linked wiki):

Dearest Helena,

By now, the news of our defeat has reached the Earth. The creatures we were sent here to tame are un-tamable, and the colonies we were sent to reclaim have proven to be stronger than we anticipated. Whatever you may hear about what has happened out here, know this: Alexei did not die gloriously in battle. I killed him. My pride killed him. And now my pride has consumed me as well. You will never see me again, Helena. Tell our children that I love them, and that their father died in defense of their future.

Au revoir.

Note that it is DuGalle’s pride that kills him, his subordinates, and virtually his entire command. He attempted to tame the un-tamable; the colonies he was sent to reclaim turned out to be far more robust than he had anticipated. Other series draw less direct but nonetheless devastating consequences from strategic blindness. In blurring the lines between human and alien, Evangelion unhinges both the protagonist’s identity and what the protagonist believes about the identity of the opponent. It reduces him to a blubbering wreck who can barely muster the will to fight. Kind of vaguely Boydian, eh?

At some basic level we have the disguised minimax assumption inherent in even non-quantitative strategic theory: the enemy will do whatever it can to thwart us and achieve its own goals (whatever they may be). But strategic blindness is simply taking this nub of an explanation as satisfactory. When aliens attack, or when enemies seem as foreign to you as aliens, you have to make basic assumptions about the “why” behind their mechanisms for killing your men and taking your cities. When you observe them attacking your base, can you say “they are attacking this way because they are adapted to the need for attacking bases”?
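To see how much is smuggled into that nub, consider a minimal minimax sketch. Everything here (the move names, the payoff matrix, the numbers) is invented for illustration; it is just the bare logic of “assume the enemy will do whatever hurts us most, then pick the move that is best against that worst case.”

```python
# A minimal minimax sketch for a one-shot, two-player, zero-sum "base attack" game.
# The move names and payoffs below are illustrative assumptions, not real doctrine.

PAYOFFS = {
    # (our_move, their_move) -> payoff to us; zero-sum, so the enemy wants the opposite
    ("fortify", "frontal_assault"): 3,
    ("fortify", "infiltrate"): -2,
    ("counterattack", "frontal_assault"): -1,
    ("counterattack", "infiltrate"): 2,
}

OUR_MOVES = ["fortify", "counterattack"]
THEIR_MOVES = ["frontal_assault", "infiltrate"]

def minimax_choice():
    """Pick the move that maximizes our payoff, assuming the enemy minimizes it."""
    best_move, best_value = None, float("-inf")
    for ours in OUR_MOVES:
        # The buried default theory: the enemy will choose whatever hurts us most.
        worst_case = min(PAYOFFS[(ours, theirs)] for theirs in THEIR_MOVES)
        if worst_case > best_value:
            best_move, best_value = ours, worst_case
    return best_move, best_value

if __name__ == "__main__":
    move, value = minimax_choice()
    print(f"minimax move: {move} (guaranteed payoff of at least {value})")
```

Note where the blindness lives: not in the algorithm, which is trivially correct, but in the payoff matrix, which encodes our guesses about what the enemy values and what it can do. Minimax quietly treats those guesses as ground truth.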

That’s a bold statement, even if they are aliens. The enemy wouldn’t choose an option that is obviously wrong, would it? Perhaps it might, under certain circumstances. Or perhaps it has a strategic calculus that is simply beyond your imagination because it lies beyond the limits of your narrow worldly experience. Either way, the more you mistake your model of the world for reality without bothering to perform some sanity checks, the less rational you will be. Strategic blindness, of course, is only a problem if you refuse to ignore it outright. As Maj. Ben Zweibelson’s dispatches from wargaming, training, and other environs seem to indicate, we would prefer to ignore it, which in practice means using ourselves to simulate the opponent “aliens.”
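That mirror-imaging move (simulating the opponent with ourselves) can be made painfully concrete. In the toy sketch below, all moves and utilities are again invented for illustration: we predict the enemy’s choice using our own utility function and get it exactly wrong, because the “alien” values something we never imagined valuing.

```python
# A toy mirror-imaging sketch: predicting the enemy with our own utility function.
# All moves and utilities are invented for illustration.

ENEMY_MOVES = ["frontal_assault", "infiltrate"]

# What WE would value if we were in the enemy's position...
our_utility = {"frontal_assault": 1, "infiltrate": 0}
# ...versus what the enemy actually values, unbeknownst to us.
enemy_utility = {"frontal_assault": 0, "infiltrate": 5}

def predict(utility):
    """Predict the enemy's move as whatever maximizes the given utility."""
    return max(ENEMY_MOVES, key=lambda move: utility[move])

mirror_imaged = predict(our_utility)   # simulating the opponent with ourselves
actual = predict(enemy_utility)        # the opponent's real calculus

print(f"we expect: {mirror_imaged}; they choose: {actual}")
# Prints: we expect: frontal_assault; they choose: infiltrate
```

The sanity check the text asks for amounts to asking where enemy_utility came from. If the only honest answer is “it’s a copy of ours,” you are DuGalle writing to Helena.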

Conclusion

Science fiction is often a hindrance to correct decisionmaking, but here it may assist us. In Liles’ original post, aliens expose the artificiality of the US military’s domain construct. There are really, as Liles argues, two domains: the domain of conflict and the seam that holds that domain together. Cyber operations target the latter to hit the former, exploiting the strategic blindness of the decisionmaker facing the “alien” onslaught. Whomp whomp whomp. Similarly, here the aliens exploit a different kind of strategic blindness: the blindness of a strategist who lacks the imagination to put himself or herself in the place of the adversary despite the numerous obstacles blocking the way. You don’t have to be fighting an alien invasion to see why that is a bad idea, despite the ultimate impossibility of ever really knowing the Other.
