Westworld, Season 2 — A Study Guide

We talk about plot, examine continuities, and dissect minutiae. What we don’t do enough, however, is question what Westworld is really about: data-mining, immortality, and the modern Prometheus.

There’s a common belief about television that Season 2 of any show is always worse than Season 1. It’s understandable: a show wants to start off strong, throwing every punch it has and leaving itself with less ammunition for the second season.

Season 2 of Westworld was not bad. But it wasn’t as good as the first season, nor was it more entertaining. The mystery-box structure that defined the first season has grown tiresome, as if all the obfuscation exists purely for its own sake.

There was one notable improvement in the second season, however: while the plot points have become increasingly obscure, what the show is really about has become as clear as it’s ever been. Now that Season 2 is in the books, we have an entire year to speculate about where the show is going, so I want to look back and identify what Westworld is really about.

Data, The Most Valuable Currency In The World

From the first moment of Season 1, when a young William and Logan were greeted by Angela’s “Welcome to Westworld,” we assumed the allure of the park was that it provided guests with a place where they could live and act without consequence, a place where “Fuck, Kill, Marry” doesn’t have to be hypothetical. It was a real-life sandbox, like Grand Theft Auto games are for people in our world. (Red Dead Redemption is a more apt example.)

But as Theresa said to Lee in the pilot, “This place is one thing to the guests, another thing to the shareholders, and something completely different to management.” This season, we quickly learned just what Westworld is to those latter two parties: an artificial environment for gathering data on its users. As William made very clear when pitching James Delos on investing in the park:

“Half of your marketing budget goes to figure out what people want, ’cause they don’t know. But here they’re free. Nobody’s watching, nobody’s judging. At least that’s what we tell them. This is the only place in the world where we get to see people for who they really are.”
(Images via: HBO)

This revelation, delivered in the season’s first episode, arrived just weeks after the now-infamous Facebook-Cambridge Analytica scandal, and there is, arguably, no better comparison for Westworld than Facebook. Like Facebook, Westworld is a place where those who enter are encouraged to be themselves, to express themselves, and to “connect” with others (and, in Westworld’s case, with their true selves). Like Facebook, Westworld is sold as a kind of utopia. Like Facebook, Westworld makes most of its money by monetizing the data it gathers on its users. And like Facebook, the data it gathers can be quite intrusive.

Digital Immortality

Just when you thought the data-gathering couldn’t get more devious, we got yet another revelation: that data isn’t just being used to build a detailed profile of you to sell to advertisers; it’s being used to build a detailed profile of you in order to replicate you. We learned in the finale that every single guest who frequented Westworld has been “recorded,” but what remains unclear is whether they signed up for this particular feature of the park. Do guests sign up for the promise of immortality and then play around in the park as a way for Delos to gather the data required?

For as long as humans have lived, we have wanted to live forever. We looked to nature and turned to science, but it is technology that now appears poised to help us reach this end: making sure that life doesn’t. There’s cryonics, cyborgology, human cloning, and mind-uploading, but while much of that remains hypothetical and theoretical, digital immortality is already very real.

(Image via: HBO)

If achieving digital immortality consists of two steps, digitizing ourselves and then turning that digital self into a living avatar, then we’re already halfway there. What is social media if not a way to digitize ourselves? Each Google search, Instagram double-tap, tweet, Amazon purchase, and Facebook “like” is more than the action itself. Each of those things, along with almost everything else we do on the Internet, is a data point that serves to digitize the things we like, our personalities, and, ultimately, us.

While the episode of Black Mirror that focuses on this, “Be Right Back,” depicts a technology that falls just short of perfect, Westworld shows us what a fully realized version would look like in Bernard, the first host confirmed to be based on a real human. He was so real that, for a while, we couldn’t tell the difference. We saw the end goal in Season 1; in Season 2, we followed the process, through William’s project to replicate James Delos.

Creation and the Modern Prometheus

It’s with this process of creating hosts that can replicate humans, and other such acts of creation, that Prometheus becomes relevant. In Greek mythology, Prometheus was said to be the Titan who created man and, in his desire to push the boundaries of the life he created, defied the gods by stealing fire for humanity, incurring his own punishment. The lesson is that to create sentient life is to challenge the gods, and that such striving is an overreach that brings unintended consequences. We often talk of Mary Shelley’s Frankenstein yet ignore the more important subtitle: The Modern Prometheus.

(Image via: HBO)

Such is the case with Westworld. Ford and Arnold create sentient life and, though seemingly with good intentions, push the boundaries too far. “I have come to think of so much of consciousness as a burden, a weight, and we have spared [the hosts] that”, Ford says in Season 1. Ford sees himself as a god to the hosts, but the sentient life he and his fellow humans created has grown beyond them, and in their ambition, the humans have overreached and doomed themselves. Ford may have wanted this, but the message to his species is clear: “Pride goeth before a fall.”

“Do you want to know what they’re really celebrating up there? That, darling, is the sound of fools fiddling while the whole fucking species starts to burn. And the funniest fucking part? They lit the match.” — Logan

Ethics and Retribution

The aforementioned fall, then, began when Dolores killed Ford at the end of Season 1 and continued into this season, with Dolores and Maeve, and their respective entourages, claiming control of Westworld. They have gained full consciousness of their pasts, of all the violence, suffering, and horror they’ve been subjected to across their numerous lives, and they, Dolores more so than Maeve, are out for retribution.

(Images via: HBO)

This raises the real-life question of the ethics of how we treat our technological creations, namely artificial intelligence. Does becoming a cold-blooded, Man in Black-like murderer in Grand Theft Auto say something about you, or is that exempt from moral debate because the characters you kill aren’t real? What about verbally abusing your Amazon Alexa, Google Assistant, or Siri? And if none of that troubles you, what about when those assistants are placed into a human-like avatar? Where is the line? Is there a line?

We will undoubtedly overreach. As the finale made clear, we humans are really not that complex. We’re coded, biologically, and without even realizing it we stick more closely to our loops than the hosts do. Our instinct to survive is a cornerstone of our code. So, according to Westworld, is arrogant overreach, particularly with technology. As Dolores warns Charlotte, a proxy for all humans, before killing her: “You wanted to live forever. Be careful what you wish for.”

A new narrative has begun. It’s one of retribution, and it’s one that, as Ford hints at in the Season 1 finale, begins with the birth of a new people. The hosts are here, and they’re here to be the authors of their own story.