Westworld: An Important Show for UX and UI Designers

Gordon Browning
Published in RE: Write
Oct 7, 2016

HBO recently premiered its newest series, Westworld, based on the 1973 movie of the same name written by Michael Crichton.

The premise is straightforward enough for science-fiction fans: a fantasy theme park modeled after the American Old West is populated by artificially intelligent “hosts” — robots that look and act fully human but exist to serve as an entertainment experience for the guests. Guests take vacations in this immersive world, each with different motivations for their visit.

What makes the show so compelling, especially for UX and UI professionals, are the numerous philosophical and design implications of working with immersive VR-style experiences and with artificial intelligence.

The show opens on a day in the life of a guest. He arrives by train from out of town and steps off into a bustling scene straight out of the Old West.

A mix of classic characters streams around him as he arrives: the crusty old sheriff imploring him to join a posse hunting a bandit in the hills outside of town, a lingerie-and-lace-clad woman propositioning him as he orders a whiskey in the nearest saloon. We see it all through the eyes of the guest, marveling at the incredible realism of the hosts and of the world itself. Every detail is perfect, and the experience appears indistinguishable from reality.

As the day ends, the scene takes a turn. There’s trouble at the home of the young woman our guest has spent the day courting, and upon arriving on horseback he stumbles onto a scene of murder and chaos. A man in black saunters out of the shadows and challenges the guest, coolly informing him of his impending death at the man in black’s hands. Our hero takes out his six-shooter and unloads, futilely, into the chest of the villain. As the man in black, unharmed and seemingly invulnerable to violence, murders our guest and drags his female friend into the nearby barn for unspecified sadism, it becomes clear — our hero was not a guest at all; he was an AI host. The man in black is the guest, and he’s visiting Westworld to live out fantasies of death and destruction.

Is this where VR is headed? It’s not what most of us think of when we think of VR, because most of us aren’t sadists. But some of us are.
And in a market economy, experiences cater to those with the money to pay for them. So what happens when sadists with enough disposable income get together? Will we see a suite of experiences offered to satisfy the darkest and most terrible urges human beings are capable of? It seems almost inevitable without some kind of deliberate restriction, and even then the black market will likely offer a workaround.

What effect will this have on society? Will it provide an outlet for the deranged among us, letting them safely expend their dark energy in a virtual environment, satiated and capable of interacting civilly with the rest of society afterwards? Or will it stoke the fires of deviancy, encouraging and enabling the monsters among us to gradually escalate their violent impulses until they’re comfortable acting out their fantasies upon real human beings? It’s a question worth asking now, before these experiences become available.

Westworld probes another unpredictable aspect of future tech — artificial intelligence. The conceit of the opening — making you believe you’re watching a human and not a robot — is brilliant, and it illustrates for the viewer the level of technology that has been achieved in this theme park. One of the park’s creators describes to his superior the hundreds of characters and thousands of interwoven storylines they’ve programmed into their hosts. The user experience of the guests is made up of a rich and complicated tapestry of real human emotions.

But when some of the hosts start behaving in oddly human ways, another executive argues that Westworld was better when the AI was less intelligent and seemed more artificial: the less convincing hosts gave guests the psychological distance to act out their fantasies without guilt or shame, and so they enjoyed the experience more. It’s a fascinating question — whether AI can ever feel “too real” and end up creating a negative experience.

True self-aware, conscious AI is still only hypothetical — no one has been able to define exactly what sentience is, and so a fully cognizant AI remains elusive. But if we achieved it, would we recognize it? And what is it, exactly? Is our humanity defined by our memories or by our capacity for conscious thought?

One of the main ethical design questions Westworld has broached so far is the treatment of AI sentience. When something looks and acts human enough to be indistinguishable from a real person, should it be immoral, or even illegal, to harm that AI, even though you can simply erase its memory of the event? Is it only our ability to remember our suffering that makes inflicting it unjustifiable?

These are the kinds of deeply confounding questions Westworld raises for UX and UI designers, as well as the general public, in just its first episode. Even if every episode from here on out is terrible, the first stands on its own well enough to be an important entry in the canon of speculative science fiction. I encourage everyone who works at the frontiers of technology to watch it and consider the questions it raises.
