Broken Worlds

After arriving in Los Angeles for E3’s annual industry extravaganza, I walked around a lot and wondered what the hell is happening to videogames.

“Steve. Steve. You’re actually here, so you know what that means. I’ve gotta show you Bioshock.”

Hallelujah.

I’m in Portland, sitting on a couch in my friend’s basement apartment. It’s been a week since I left Los Angeles following the close of June’s Electronic Entertainment Expo, the biggest professional videogame trade show in the country and one that I’ve grown increasingly weary of covering over the past four and a half years. I sigh, rolling my eyes. I haven’t been home to Seattle now in almost two weeks. I’m a little bit drunk and I know there’s not really going to be any talking my overzealous pal out of what’s about to happen. He takes my non-response as a sign to proceed, grabs Bioshock Infinite from the shelf, throws it in his 360 and plops himself down on the couch next to me. I reluctantly take the controller.

I’ve been avoiding the newest Bioshock, more or less, since a couple months before its release this past March. The way the videogame industry works is that once a huge, mega-hyped release is just over the horizon, the frenzy of anticipation becomes like a black hole – regardless of personal involvement, the only way to escape some type of exposure is to get offline and turn off your phone.

After Booker DeWitt climbs a mysterious lighthouse and is rocketed tens of thousands of feet into the air to the floating metropolis of Columbia, I am reminded that, in theory, I don’t really have anything against Infinite. In a not-unexpected display of graphical prowess, my introduction to the city fills me with a mix of confusion and wonder. Perhaps Booker feels the same as his metal cage shoots past zeppelins shining in the high sun and levitating structures that loom with imposing presence and ornate architecture. The theme park world feels like something Walt Disney would have created in an alternate timeline of history, if the man had been an unhinged sociopath with plans for unprecedented cultural eugenics.

When the ride slows to a stop, I’m deposited into a sunlit alcove with a central mural and a backdrop of soft choral tones. I already know what awaits. Infinite’s baptismal ceremony has been the subject of much debate amongst the press, though given the game’s overt religious zealotry the inclusion is neither surprising nor offensive to me. In the brief moments before approaching the baptismal chamber, a long candle-lined path in ankle-deep water for holy induction, I’m nevertheless struck by the weight and implication of the words on the stained glass in front of Booker and me: AND THE PROPHET SHALL LEAD THE PEOPLE TO THE NEW EDEN.

A profound dichotomy.

As I step forward into the alcove, a twinge of sadness hits me along with a familiar mechanical sound: an objective marker appears on the screen commanding me to FIND A WAY INTO THE CITY. The overly game-y intrusion makes me want to put down the controller. I am suddenly out of the moment, any emotion I was supposed to be feeling at the scene snatched away with the appearance of a cloying objective hint. I have a fleeting, vaguely blissful notion of quitting the game industry.

I begin walking the path of Columbia’s righteous. “Path” is the best descriptor here, as Infinite’s narrative claws are dragging me down this straight and narrow line where I weave back and forth, trying to jump over the candles on either side of me.

“You have to go forward, Steve,” my friend says. He is clearly annoyed, having to voice a design parameter we both already knew was indisputable. When the priest dunks Booker’s head and we have accepted the new prophet, I take my first steps into the suspended city with no particular enthusiasm.

Later, confronted by Columbia’s casually übermenschish patriotism, I shake my head. “See, I like the ideas here,” I tell my friend. “The themes and the art direction are really interesting, but the problem is that I know it just turns into a big, stupid shooter.”

“And that’s a totally valid complaint,” my friend says, agreeing with me for once. “It does turn into a big, stupid shooter.”


E3 2013, my fifth, was unexpectedly weird. It ended in mid-June and I’m still coming to grips with what it means. It wasn’t that E3 itself was so strange; it was the whole industry. Even the genuine excitement that bubbled up following Sony’s PS4 blowout announcements felt like an oddity, a rare moment of enthusiasm that spilled over and fizzled out too quickly, like bad champagne. The consumerist mess that would follow over the next three days soon drowned it out.

Microsoft’s presser had two shark-like McLaren MP4-12Cs circling the Galen Center. Videogames: 2013.

It had been Sony’s moment to seize. Microsoft’s hour-long press conference had been concerned with assuaging the fears of hardcore Xbox 360 fans out for blood over the Xbox One’s staunch DRM measures. Before arriving at Los Angeles’ Galen Center for the MS presser that morning, the first thing I noticed from the cab window was a pair of brightly colored McLaren MP4-12Cs advertising Forza Motorsport 5. May’s Xbox debut event in Redmond was all bros, TV and testosterone, so it didn’t come off as much of a surprise to be greeted by a couple of hypercars casing the building. I remembered the disastrous Cirque du Soleil Kinect reveal a few years back where journalists were made to wear brightly colored electric ponchos. I was glad I wasn’t there. With Sony’s conference still hours away, it seemed like 2013 was so far shaping up to be E3 as usual.

I don’t usually attend press conference day. Whatever the console makers and third-party heavy hitters have to say to a live audience is simultaneously broadcast as a streaming video, so you can just as easily watch the conferences from the comfort of a hotel room or a barstool. The scant news that comes out of these things – often sandwiched between lame scripted banter that makes the Triple-A videogame industry strike depressingly close to bad Hollywood stereotypes – always hits immediately anyway.

At Sony’s conference, I snuck into a restricted “bloggers only” area to get a better seat – not that anyone was actually monitoring the handful of wifi-enabled tables. The setup was par for the course, the stage surrounded by an arcing array of television screens looping a sizzle reel of PS4 launch titles, the speakers pumping hit singles designed for mainstream appeal. After the Xbox disaster that morning, what reason did I really have to be here? Perhaps I just wanted to see firsthand what sort of counter Sony had in the works for their Redmond competition. Or maybe I wanted to see a train wreck.

Then Sony Computer Entertainment America president Jack Tretton announced, not without a certain degree of smug (and deserved) satisfaction, that the PS4 wouldn’t employ any DRM or “always online” policies. The concept of digital ownership (or whether or not a purchased physical game disc is actually your property) had, up to that point, been coming under heavy fire. The DRM and used-game countermeasures the Xbox One was touting would limit players from trading physical games with friends, and its always-online requirement would make the console useless if not tethered to a constant high-speed internet connection. When the uproar of the crowd issued forth from the mass of journalists and professionals gathered in the Los Angeles Memorial Sports Arena, it really seemed worth being there, fake live-blogging the event. For a brief second, it almost felt like E3 might, for once, be worthwhile. Videogames might be OK.


The show floor of E3 at the LA Convention Center is something you gradually come to ignore. The two halls where most of the industry’s big budget, Triple-A muscle congregates are cacophonous cathedrals of colored light, deafening orchestrations of machine gun fire and gooshy carpet you could easily fall asleep on. This is a place devoted to the worship of the annual Call of Duty, Madden and Assassin’s Creed installments by way of the elaborate iconography of multi-million dollar booth displays – mechs and marines and monsters and scantily clad models calling out, appealing to one’s specific spiritual devotion. The throngs at mass: gawking fanboys, Gamestop stooges, camera crews and industry executives effortlessly all wearing the same white-or-light-blue starched shirt and dark-gray-or-navy-blazer combo. In many ways, videogames are a uniquely Californian industry and, if you don’t live in development hotbeds like San Francisco or Los Angeles, the glitzy promotion feels a little less everyday. A theme park for yearly pilgrimage. We are come to worship at the altar and preach the gospel of videogames. And the prophet shall lead the people.

Attending E3 as a reporter, as much as the word applies when covering anything in the entertainment business, there is no time to be exhausted by spectacle or the endless mass of people. The crowds gathered to watch the multiplayer footage of Shooter X are obstacles. You view the path in front of you in dynamically changing angles, a series of real-time directions carved out towards the next appointment: cut through the left of this slow-moving group, shift right, twist your shoulder, thread across the comparatively empty space of the booth next to you, lift up your messenger bag and adjust your weight slightly to avoid the oncoming traffic, taking in the passing sights only as needed to find the quickest path to your next destination among the hubbub.

Capcom paid models to play zombies in their Dead Rising 3 chain-link undead pen for the duration of E3. At least I assume they were paid.

Briskly moving past the display for Capcom’s Dead Rising 3, a zombie cage populated with actors paid to shamble around and ravenously moan at passersby for the duration of the three-day show, I briefly entertained the thought of sticking my hand through one of the holes in the chain link at the undead girl licking her chops in my direction. How far would this LA performer go with her character motivations? Just like the location of each major publisher’s booth on the floor, the action year to year never really changes.


It is easy to feel the increasing drudgery of E3 as the number of shows you cover begins to mount. The conference highlights the most bloodsucking aspects of the videogame industry while ignoring the fascinating designs and people and ideas that are pushing the medium forward. Games are less a medium here than a money-grubbing bottom line.

Change has been gradual. The unveiling of a slew of new indie games for the next PlayStation was not a surprise once Sony announced a commitment to innovative, smaller-scale downloadable titles at the Game Developers Conference in March. Yet the reassurance that the PS4’s first year would see a number of less-mainstream titles from a company also responsible for publishing games like Killzone – from whose name alone you can probably guess the depths of its imagination – was nevertheless refreshing.

World building designs like Project Spark are a nice idea, but just leaving creativity up to your audience doesn’t help the industry evolve much.

Sony’s booth was populated with creative-minded titles, something the industry desperately needs at its heavily-advertised forefront where “build your own game” sandboxes like LittleBigPlanet and the Xbox One’s similar Project Spark aren’t enough. Other than PS4’s efforts and Indiecade’s tiny oasis in the corner of South Hall, walking around the show floor might not give you any indication games could be anything other than shooting terrorists and players teabagging each other. E3 is a truncheon to progress.


For the self-serving bottom line, 2013 was – and in most ways should have been – a banner year for the Electronic Entertainment Expo. Consumers want to know about next-gen hardware. They want to know what they’ll be playing when they get a PS4 or Xbox One, to see what these allegedly new game experiences are capable of.

I’m interested in next-gen tech, too, though my curiosity involves pushing it beyond the surface level titillation of multi-million dollar videogames that look one step removed from CG. What might developers do with that power? The potential implications next-gen could have on artificial intelligence, for example, are enormous.

AI has always been something of a means to an end. It’s rare to hear PR reps talk about revolutionary steps forward with its implementation, aside from simply highlighting how this new product is a ruler’s length smarter or more adaptable than whatever it was in the last title the developers worked on. After a while, videogames become a series of systems laid bare for any observant player to see: a network of if/thens, branching paths and executable decisions based on user input. It has been a long time since the grunts in 1998’s original Half-Life worked together to outwit and outflank you, a frightening prospect back then that revolutionized what first-person shooters were thought capable of.

I wouldn’t have expected Microsoft to be the one company seeming to push AI advancement. They took a lot of shit over their seemingly ironclad policy requiring a continuous internet connection (“Fortunately, we have a product for people who aren’t able to get some form of connectivity,” smirked then-Microsoft president of interactive entertainment business Don Mattrick in an interview during the show. “It’s called the Xbox 360.”), yet the cloud behind that connection could be put to use in any number of ways. Surprisingly, Forza 5 seemed to be leading the pack for AI by doing just that.

Forza 5's Drivatar system races for you when you’re not around, as though alive. Maybe.

Rather than resorting to typical AI, which in racing games usually means a sliding scale of aggressive-to-unfair behavior from rival computer-controlled drivers, Forza 5 uses the ridiculously named “Drivatar” system, which replaces AI with actual data from other players connected through a cloud server. Every player in Forza has a Drivatar that continues to drive around (whether you’re actively playing the game at the moment or not), gathering data in your absence. Instead of driving against computer opponents when you take control, you’re driving against other Drivatars. I’m not totally sold on Forza 5 as the “end of AI,” as MS handily touted it – you take these sorts of things with a grain of salt – but it’s certainly an interesting prospect.

Or, it might have been. In the weeks following E3, Microsoft reversed nearly all of the Xbox One’s contentious policies – no more draconian restrictions on used games that would limit the number of friends you could lend a disc to (or the number of times a game could be so lent), no publishers determining how many consoles a game could run on, no activation fees if someone wanted to play a used game. Rules for reselling used games are no longer in the hands of publishers, either, and you don’t have to be on someone’s Xbox Live friends list in order to borrow a game from them. In addition, with the always-online connection and 24-hour online check-in requirements abolished, plans developers may have had to utilize the Xbox One’s cloud to free up resources for weird experiments like Forza 5’s Drivatar may no longer be applicable.

The degree to which MS is now backpedaling away from their crazier notions for the cloud is anyone’s guess, but considering the way major videogames are viewed only as a business these days, I wouldn’t put much stock in a robust plan for online innovation.

In general, AI wasn’t a major concern for companies with next-gen titles on display. To be fair, it’s hard to judge what you see in an E3 presentation as totally representative of what a game may actually end up being. These are carefully controlled exhibitions, sliced out to give you a general impression of whatever they’re showing off, an information drip that lasts from one conference or media event to the next. It takes any developer a while – as in, a number of years, usually – to learn and gain control over the programming parameters of new hardware. But I’d be lying if I said it wasn’t a little disappointing to see the future of videogames depicted solely by sexier graphics. Shooters now had newly intense particle effects but stealth games still sported rudimentary sentry paths. Not exactly a vast technological sea change.

Admittedly I will probably play Battlefield 4 when it releases, because everyone who plays games likes indulging in the shallow joy of sexier graphics every once in a great while. I doubt I’ll pick up another military shooter, though, until someone makes the videogame equivalent of War, Sebastian Junger’s sobering account of 15 months spent embedded with a platoon of American troops in Afghanistan’s hostile Korengal valley. Maybe the Call of Duty dog, as wonderful as its existence may be, is the signal that the modern military genre has jumped the shark and that it’s time to find some new trendy bandwagon to jump on and eventually get sick of.

No game has ever captured what civilians like most of us can only read about and speculate on: the true horrors – and the maddening tedium – of war.

Already it seems that trend may be the open world, with go-anywhere freedom quickly in danger of becoming the go-to buzzword for so-called next-gen design. I guess if improved AI just equates to smarter soldiers that flank you that much harder, it may not be such a big loss anyway.

Either way, maybe we should try moving off the battlefield, at least until someone’s ready to bring back Warco or Six Days in Fallujah.


“Write whatever you want.”

That’s my editor, Medium’s own Stu Horvath, telling me what he’s looking for before the show has begun. Stu took ill and had to cancel his trip to the West Coast and, as a result, I’ve sniped nearly all of his appointments. This puts me in the enviable position of being a freelance writer with nowhere to go but where I want to go. In an assignment-driven business, this is almost a vacation.

In between appointments, I sit in the pressroom in West Hall and think about what E3 actually is this year versus what I thought its potential might be. This year is weird because E3 has become irrevocably weird. In the days before the live-streamed announcements, smartphones and 24-hour news cycles that have marginalized the impact of all news announcements, E3 was the time when everyone pulled out all the stops to really try to surprise fans and competition alike.

When then-SCEA president Kaz Hirai unveiled the PlayStation 3 in 2005, the insane price point of “Five hundred and ninety-nine US dollars!” was certainly a shock. This year, Xbox One’s price tag was revealed at $499 – $100 more than the PS4 – and it almost felt like a side note, quickly forgotten in an hour-long barrage of videogame trailers and presentations.

Those economics aren’t off, at least not in terms of expectation (though the backlash and derision aimed at a $500 Xbox was palpable). The industry’s growth throughout this generation has not always been for the better. Money has twisted the top-heavy upper echelons toward loud incoherence, like a well-used San Fernando porn star going through familiar motions because they reliably pay out. Unlike past E3s, few titles among 2013’s Triple-A crop had much originality to them.

Killer is Dead captures Suda 51's high-flourish auteurism in often-blindingly chaotic displays of chromatic violence.

I did play a handful of fun current-gen games. The next 3D incarnation of Castlevania was, pleasantly, almost exactly how I’d pictured a sequel to the underrated Lords of Shadow; Dark Souls II was crueler and more dynamic than its predecessor, with impressively improved visuals; Suda 51’s Killer is Dead delivered an engaging aesthetic experiment in violence and color – like a more graphic El Shaddai. In motion, it reminded me of cherry blossoms: the frenzied swordplay of your cyborg executioner avatar surrounded by a near constant bouquet of curdled plumes of pink and purple and red that shift and unfurl so thick and fast with each blade swipe it’s hard to keep track of the action on screen.

A handful of non-playable presentations also proved entertaining, though what I enjoyed at the show failed to move me much. The problem was nothing stuck with me. I didn’t even make any real appearances at the nightly open-bar E3 party scene that publishers and hardware companies put on for show attendees. I grasped at straws countless times in front of my computer, Word’s cursor blankly blinking at me.


I ended up frequently walking around downtown LA. In the absence of parties I waited until the pressroom closed, usually grabbing dinner somewhere nearby with a handful of friends that seemed to change every night. We sat around tables cluttered with the forgettable culinary options of LA Live, the soulless commercial drag encompassing the Staples Center just up the Figueroa promenade from the convention center, seeking out opinions on the show. At least we could all agree E3 felt off. A momentary solace.

I don’t like Los Angeles. Like all big cities, it’s dirty (I’ve noticed downtown often smells like piss) and in every part I’ve visited, it seems dirtier than the last. The long city blocks seem to stretch the distance between you and the skyscrapers that sit unmoving in the polluted air, projecting the illusion that you’re so small that even when at a constant, brisk walking speed you never actually go anywhere. No matter how close you’re staying to the LACC – with the possible exception of the Figueroa Hotel just a few blocks north – you’re never close enough to not immediately want to jump into a cab.

Not that it’s a walking city anyway. The city’s 500 square mile sprawl stretches to the proverbial horizon. If your whole existence there is, as mine has been, limited to the small Southwestern corner of downtown encompassing Figueroa, its numbered streets branching East and the occasional trip West to Hollywood, your experience is probably not ideal. During the day the heat makes getting anywhere on foot feel oppressive. The combination is suffocating.

Pershing Square, a few blocks from the LA Central Library. The gross yellow light isn’t really a trick of the camera.

Los Angeles at night is its own beast. Figueroa is bizarre enough, since for all the storefront restaurants and shops, the only life seems contained within its multi-lane traffic patterns. You pass people on the street but no one seems to be heading towards a destination. Those already inside the shops and bars seem only to exist within them, entire lives spent as living, immobile puppets. Even here, you can’t outrun the long marketing arm of videogames, as giant ads plaster the sides of hotels and high rises, making sure you’re well aware of what first-person shooter you have to go check out in South Hall.

Several times people have remarked to me how alien the cityscape is at night. Alienation has driven the Los Angeles narrative in innumerable forms since the postwar era and it’s hard to picture the town being anything but as mottled and filthy as it is today. Downtown doesn’t seem to be populated by actual Angelenos – I’m positive no one actually hangs out there. For the week I’m there, I’m one of thousands of out-of-towners damned to walk streets that harbor a chill bordering on menace. The feeling of being an outsider never dissipates.

Los Angeles in all its glory.

Every night on my zigzag trek east from Figueroa to the loft where I’m staying, I walk past the famed Los Angeles Central Library, a landmark I know from L.A. Noire, now dwarfed by the columns of buildings that tower around it on the streets adjacent. It’s convenient to romanticize the isolation of the city at night, as Michael Mann so expertly did in his dingy, grainy Collateral. The most romantic thing I see in my expeditions past the library is a greasy three-inch cockroach skittering soundlessly from the blackened sidewalk into a drainage ditch.


I bluffed my way in to see Deadly Premonition director Swery65’s new Xbox One title D4 on the last day, catching the second half of a 30-minute presentation that was split with an Xbox Live Arcade game so forgettable I didn’t bother writing its name down. D4 looks every bit as strange and wacky as Deadly Premonition’s Twin Peaks sensibilities would suggest: the plot involves a private detective who can travel back in time and is trying to solve his wife’s murder, all while evidently cognizant that his manipulation of the past can’t actually change the inevitable outcome of her death.

After the presentation, an outlandish 15 minutes of laugh-out-loud character interaction and weird Kinect interfacing, I asked Swery – impeccably dressed in a goofy Nipponese mod style that included designer frames, a yacht jacket, a fashionable summer scarf and leather shoes that probably cost around 400,000 yen – how he came up with D4’s madcap premise. His answer was short and to the point.

“I have no idea,” he said to me in strong English, not without a hint of irony. “My life.”

D4's debut trailer at the hour-long Microsoft press conference was given less than a minute of screen time.

Out of all the Xbox One games Microsoft was scheduling appointments for at their booth, I’d guess less than one percent of attendees elected to check out D4.

Swery, like Suda and several other international developers, is in many ways what makes the industry bearable. The profound and subtle differences evident in games hailing from outside North America often make for more engaging premises, narratives and approaches to design. And while I can’t dispute that some Japanese developers simply aren’t willing to take many financial risks, there seems to be more willingness to embrace auteurism in their work, a contrast I can only attribute to a difference in pop-cultural interaction.

The Evil Within, Shinji Mikami’s return to Resident Evil 4-esque survival horror, is a clear message from the man who popularized the genre (Alone in the Dark predated Resident Evil in concept by a number of years, but it failed to resonate much with a large audience). In the wake of high profile disasters like RE6 and Dead Space 3, it seems clear that the majority of the industry doesn’t know how to deal with horror, be it action-based, psychological or otherwise.

The gameplay I see in The Evil Within, a PI trying to escape a hellishly graphic asylum strewn with enough gore and monstrous imagery to fill a dozen grindhouse films, isn’t especially scary, but neither is playing Fatal Frame with multiple friends in a room, an activity I unsuccessfully attempted as a teenager. Mikami’s decision to break from the popular horror paradigm is no less piquant, especially when measured against the Wolfenstein sequel The New Order, which was paired with The Evil Within’s presentation.

Shinji Mikami’s The Evil Within is a clear statement on survival horror’s more incoherent modern offerings.

Wolfenstein has been getting more buzz due to its breakout adventure game narrative moments (Evil Within has been deemed “too old-school”) though the Swedish-developed game feels much more Western than it could be. I have a hard time staying in the moment whenever the Aryan-countenanced Nazi killer B.J. Blazkowicz breaks the game’s supposed moody narrative intelligence by mouthing off incongruous one-liners.

Konami wasn’t showing any more of Metal Gear Solid V at their booth than Hideo Kojima’s grim, breathtaking director’s cut trailer. One night at the loft, I parsed through it to piece together the meaning of the many red herrings and meta-misdirection Metal Gear’s ingenious director had already pulled, not the least of which was disguising the game’s reveal as another title from an unknown European developer during Spike’s VGA broadcast last December. Kojima’s everlasting imprint on the industry, that of a creative mastermind of the medium that never fails to surprise and push forward videogames as a whole, made my analytical process more rewarding than many of my experiences on the show floor had been.

The Wild Hunt, the third installment in the Witcher series and a massive hunter-and-hart fantasy tale that looks to outsize Skyrim by a considerable number of leagues, is about as close to an interactive Game of Thrones as we’re likely to ever get, outclassing any new RPG developed in the United States or Canada. The Witcher began as a series of novels by Polish writer Andrzej Sapkowski, which is probably a telling reason why the developers at CD Projekt Red have been able to adapt it into such a literary-feeling series: rich with politics, character interactions and a maturity that reflects deeper narrative concerns and makes The Elder Scrolls look skeletal. The Wild Hunt was running on an actual PS4 debug kit able to render stunning, seamless landscapes with dynamic weather – not the most exciting use of all that tech, but through CD Projekt’s tour of the ordinary and in seeing such a realized and sophisticated and European world, I found something genuine.

Bayonetta 2 and Dragon’s Crown, a pair of wildly dissimilar Japanese brawlers, drew considerable crowds, presumably by not being stuffed into non-descript meeting rooms despite their niche appeal. Both games looked great: Bayonetta with its endless kinetic energy, Dragon’s Crown a playable HD exhibition for the painterly Western-free brushstrokes of Vanillaware’s George Kamitani. If you have any institutional memory for modern Japanese game design, you don’t really need hands-on time with these to understand them.

I couldn’t help but wonder, while watching someone button mash their way through the Bayonetta demo, if half the people lined up to play were distracted first by the allure of free related swag or pictures with booth babes. In a head-to-head sales battle against high-profile Western releases, the Bayonettas and Dragon’s Crowns almost always finish a trailing second. As the demo player struggled with Bayonetta’s controls, I noticed the model Nintendo had hired to cosplay as Bayonetta posing nearby. I hated that I had been around long enough to notice she was far too diminutive to portray the character the way the girl at Sega’s booth promoting the original game had, across the hall, years earlier.


It’s around 8:30 in the morning and I’m on speakerphone. On the coffee table is a set of questions I’m working through with a high-profile European developer overseas – arguably one of the most ambitious designers working today. We have similar opinions on the potential of the medium versus the philosophical space they occupy on a commercial level. As the conversation winds on, he eventually gets to the heart of why he feels the industry needs to change.

“That same old gameplay no longer interests me,” he says.

I’m some hours into the tense, nightmarish post-apocalypse of The Last of Us and it’s the first game I’ve touched since finally returning home. I’ve always admired Naughty Dog for their narrative chops and technical wizardry. I’m looking forward to seeing what progresses in the desperate, brutal world Joel and Ellie inhabit – though their story is probably the last interactive pandemic narrative I’ll feel the need to digest for the foreseeable future. There are a handful of other games I’ll happily play in the coming months, for stylistic reasons or more substantive ones: Killer Is Dead, Beyond, maybe a couple PS4 launch titles and the meditative and moody indie sci-fi experiments Routine and The Swapper, if I can set aside the time to get them to run on my MacBook. The only games I’m interested in anymore are ones that make me think and challenge my brain. Maybe that’s why lately I’ve been sticking to novels.

If there was ever a time to step up and evolve, with more Journeys and Papo & Yos and fewer fictions where every single scenario involves you picking up a gun and murdering your way through thousands of people, it’s now.

Maybe a better use of my time.

I don’t hate the game industry, even if its machinations, so apparent in the reflection of E3 and the town that’s long borne it, are something I find increasingly frustrating and problematic. Nor have I given up on videogames as a medium for creative expression. In fact, I’ve had more conversations with smart, fascinating, inspiring developers than I can count. It’s just that the traits the industry deems important seem silly. We’re not allowed to deviate from that candle-lit path. Hell, PR treats breaking embargoes – the binding agreements that prevent anyone from reporting on anything from those carefully drip-fed events until a certain date – as a form of treason. All this over pieces of entertainment software. As I wrap up my interview, it’s nice to talk with someone who shares some of my ideas. Maybe I just need a break from trade shows.

But, for the most part, I think I’m ready to put down the controller and let all the acts of violence in all the big, stupid Columbias in the digital world sort themselves out without me.


Becoming Human

CryEngine’s latest tech makes great strides toward ascending the uncanny valley


Exponential growth in technology is making it harder and harder to convince ourselves we’re not living in a piece of everyday science fiction. Between DARPA contractually collaborating on faunal field operators and smartphone AI developing fast enough to outpace our own neural networks in a few short years, the future seems as prevalent as it is striking.

So naturally conversations about the uncanny valley, that “off” feeling you get when encountering something artificial that’s mimicking a human without quite pulling it off, have become increasingly commonplace. And like so many other complex ventures into new territory, the crucial difference between ascending beyond the confines of that wide gulf — or falling just short — is a matter of critical nuance.

As a next-generation game designed to take full advantage of the power of Microsoft’s Xbox One, the vividly barbarous legionary tale Ryse: Son of Rome is about as subtle as a gladius to the gut, even as the tech behind it employs countless seductive technical subtleties. If nothing else, the wizardry of developer Crytek’s proprietary CryEngine is a sight to behold: Ryse’s opening setpiece rages through the haze and cinder of battle as Rome burns and the brushed metallic finish of an army of centurions reflects dully against a late afternoon sun.

CryEngine’s technical wizardry brings everything from legit cinematography to realistic lighting to life in near-CG quality.

Though some aspects of the user interface are necessarily immersion-breaking, filmic touches like camera shake, depth-of-field photography and motion blur can make it hard to differentiate between Ryse and a motion-captured CG film, particularly at a mere glance. Of course, the uncanny valley is most squarely focused on human likeness — the term was coined in 1970 by robotics scientist Masahiro Mori to explain why we find technological facsimiles that look like us so disturbing — and breaking free from the confines of the uncanny is where Crytek has arguably labored the most, creating some of the most realistically rendered game characters to date.

When it comes to believability, this focus is to be expected.

“The biggest [factor] to get over the uncanny valley is definitely the facial animation,” says Crytek US engine business development manager Sean Tracy. “That’s the thing that breaks more often than anything, is the faces of the characters.”

A player’s sense of unease tends to go from non-existent to extreme when a character’s face itself breaks, like, say, when a glitch causes a character to clench their teeth in an unnaturally horrifying open-mouthed smile while they’re supposed to be speaking lines of dialogue. (This has actually happened to me in a big budget game, though it wasn’t Ryse.)

These breakdowns can be the result of having a low number of bones available to influence each vertex (the points that define the edges and shape of a rendered mesh) in a model’s facial skeleton. The fewer bones per vertex, the harder it is to accurately mold complex, layered surfaces like skin. Typically four is the number that animators use, but Crytek has doubled that, developing what they call “eight weight skinning”.

“When you have a really dense face, it’s a little bit tricky to only have four bones influence a single vert because you can’t do the folds, you can’t get things around the nose or around the mouth deforming the way you would really expect it to deform,” Tracy says.
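For the curious, here is a minimal sketch of the linear blend skinning idea Tracy is describing, with room for eight bone influences per vertex. The types and names are illustrative assumptions for this article, not CryEngine’s actual API.

```cpp
// Illustrative sketch of linear blend ("weight") skinning with up to eight
// bone influences per vertex; names and types are assumptions, not CryEngine API.
#include <array>
#include <cstddef>

struct Vec3 { float x = 0, y = 0, z = 0; };

// A bone's skinning transform (bone pose * inverse bind pose), reduced here
// to a 3x3 rotation plus a translation for brevity.
struct BoneTransform {
    float r[3][3] = {{1, 0, 0}, {0, 1, 0}, {0, 0, 1}};
    Vec3 t;
    Vec3 apply(const Vec3& p) const {
        return { r[0][0]*p.x + r[0][1]*p.y + r[0][2]*p.z + t.x,
                 r[1][0]*p.x + r[1][1]*p.y + r[1][2]*p.z + t.y,
                 r[2][0]*p.x + r[2][1]*p.y + r[2][2]*p.z + t.z };
    }
};

constexpr std::size_t kMaxInfluences = 8;            // "eight weight skinning"

struct SkinnedVertex {
    Vec3 restPosition;                               // mesh-space position
    std::array<int, kMaxInfluences> bone{};          // influencing bone indices
    std::array<float, kMaxInfluences> weight{};      // weights summing to ~1.0
};

// Linear blend skinning: transform the vertex by each influencing bone and
// sum the results, weighted by how strongly that bone pulls on the vertex.
Vec3 skinVertex(const SkinnedVertex& v, const BoneTransform* bones) {
    Vec3 out;
    for (std::size_t i = 0; i < kMaxInfluences; ++i) {
        if (v.weight[i] <= 0.0f) continue;           // unused influence slot
        const Vec3 p = bones[v.bone[i]].apply(v.restPosition);
        out.x += v.weight[i] * p.x;
        out.y += v.weight[i] * p.y;
        out.z += v.weight[i] * p.z;
    }
    return out;
}
```

With only four slots, a vertex deep in a nose fold has to ignore some of the bones tugging on it; doubling the slots is what lets that fold deform the way Tracy describes.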

Crytek has been working towards perfecting realistically rendered faces for years in games like Ryse and Crysis 3 (right).

Eight-weight skinning is only part of Crytek’s realistic face equation. In addition to motion capture, which the team did with the help of an outside effects house, the next step is using corrective blend targets. Think of these as a kind of composite crafted by the engine, which chooses from a library of facial models based on the current animation of a character’s face.

That library of “morph targets” is how animations were typically done before tech advancements changed the game.

“Basically you would have maybe 90 or 100 models of this face in different sort of shapes, so he might be saying ‘O’ or ‘Yea’ or whatever those different phonemes are that we want from the lips,” Tracy says. “So in the past you would actually just blend in different morph targets depending on what he’s saying.”

Now that more primitive process is coupled with the performance capture data.

“When we’re doing a certain bone animation — for example when [protagonist Marius Titus] is screaming, we’ll actually blend in a sort of screaming morph target during the bone animation. So what happens is you get a mix of the morph target, plus this bone animation,” Tracy says.

That may all seem pretty technical, but the result is a face free of any unnatural mathematical tearing at its seams — sort of an animation equivalent of using Photoshop’s healing brush.

“That’s why these are corrective,” Tracy says. “[We’re] sort of fixing the mouth so it doesn’t get completely torn apart, because typically in games you do lose some control of the vertices, especially on the outer edges, so [Marius’] mouth might look way too wide or something.”
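As a rough illustration of what that layering looks like in practice, here is a hedged sketch of applying a corrective blend (morph) target on top of already-skinned vertex positions. The structure and names are guesses for the sake of example, not Crytek’s implementation.

```cpp
// Sketch of layering a "corrective" morph target over bone-animated vertices;
// data layout and names are illustrative assumptions only.
#include <cstddef>
#include <vector>

struct Vec3 { float x = 0, y = 0, z = 0; };

// A morph target stores, for every vertex, the offset from the neutral face
// to a sculpted shape (a scream, an 'O' phoneme, and so on).
struct MorphTarget {
    std::vector<Vec3> offsets;
};

// Nudge the skinned positions toward the sculpted shape by a weight that the
// performance-capture data drives (0 = ignore target, 1 = full correction).
std::vector<Vec3> applyCorrectiveTarget(const std::vector<Vec3>& skinned,
                                        const MorphTarget& target,
                                        float weight) {
    std::vector<Vec3> out = skinned;
    const std::size_t n = out.size() < target.offsets.size() ? out.size()
                                                             : target.offsets.size();
    for (std::size_t i = 0; i < n; ++i) {
        out[i].x += weight * target.offsets[i].x;   // blend in the corrective shape
        out[i].y += weight * target.offsets[i].y;
        out[i].z += weight * target.offsets[i].z;
    }
    return out;
}
```

During a scream, for instance, the weight on a “screaming” target would ramp up alongside the bone animation, keeping the mouth from tearing apart at its outer edges.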

Complications notwithstanding, there’s no one-step solution to effectively combining bone animations and corrective blend targets.

“There’s not a lot of magic in terms of technology for the facial system. That’s a lot of really hard work by a lot of artists in Ryse,” Tracy says.


Crytek may be getting close to escaping the uncanny valley, but their pursuit of realism has always been a point in and of itself rather than the product of some deeper-reaching philosophy. With big budget game productions yielding shorter and shorter experiences, it’s important that fans get the highest production values possible for their money, says Tracy, adding that the company’s push into the realm of the photoreal has always reflected that.

You could practically live here (though you probably wouldn’t want to).

“With Cevat, it’s always been photoreal — it needs to be believable, it needs to be immersive,” Tracy says, referring to Crytek president and CEO Cevat Yerli. “That’s kind of our vision, if you would. Is trying to get to pure CG quality in real time. And honestly we’re very close.”

A key difference that widens the gap between Crytek and their competitors is that the Frankfurt, Germany-based developer goes out of their way to keep as many aspects of their tech in-house as possible, so compatibility issues with third-party middleware (for instance Simplygon, the graphical scaler used with the Unreal engine that renders scenery in greater or lesser detail depending on how close the player is to an object) are never a concern. It’s a methodology that highlights the extreme attention to detail they bring to their work.

This high-end direction is nothing new. Their Crysis series has been lauded countless times over the past several years for its performance and detail when running on a souped-up PC, though with the advanced processing capabilities of next-gen hardware, Ryse is something of a benchmark for console titles.

Every ounce of power is needed, too. Apart from the game’s various animation techniques, Tracy says other components contributing to Ryse’s rich output unsurprisingly eat up a lot of CryEngine’s bandwidth.

“That’s kind of our vision, trying to get to pure CG quality in real time. And honestly we’re very close.”

The other biggest contributing factor to making Ryse look as photoreal as it does is probably realistic lighting. Crytek’s solution here, called physically-based shading, is something they have been building towards for years. In a nutshell, physically-based shading renders a world with realistic lighting — that is, lighting that accurately reflects, refracts and diffuses according to the different types of materials it touches.

“Classically in games this is actually a really tricky thing to solve, because all our materials react very differently to certain types of lights,” Tracy says. “Whether it’s fire that’s flickering or whether it’s the sunlight. So what we needed was consistency across the entire game.”

With the recent advancements in next-gen hardware, the team was able to get this computationally aggressive set of algorithms up and running, and Tracy says the abundant juxtaposition of metal and non-metal materials in Ryse made it a good candidate to put their physically-based shaders through their paces.
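To make that idea of consistency concrete, here is a loose sketch of the kind of metalness/roughness shading model the term physically-based shading usually refers to. It is a simplified stand-in under stated assumptions, not Crytek’s shader code.

```cpp
// Loose sketch of a metalness/roughness ("physically based") shading model;
// parameter names and the simplifications are assumptions for illustration.
#include <algorithm>
#include <cmath>

struct Vec3 { float x = 0, y = 0, z = 0; };

static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 normalize(const Vec3& v) {
    const float len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

struct Material {
    Vec3 albedo;      // base color
    float metalness;  // 0 = dielectric (wood, cloth), 1 = metal (a centurion's cuirass)
    float roughness;  // 0 = mirror-smooth, 1 = fully rough
};

// One set of rules for every material and every light: Lambert diffuse plus a
// GGX-style specular lobe with Schlick Fresnel, with the geometry/visibility
// term folded into a constant for brevity.
Vec3 shade(const Material& m, Vec3 n, Vec3 toLight, Vec3 toEye, const Vec3& lightColor) {
    constexpr float kPi = 3.14159265f;
    n = normalize(n); toLight = normalize(toLight); toEye = normalize(toEye);
    const Vec3 h = normalize({toLight.x + toEye.x, toLight.y + toEye.y, toLight.z + toEye.z});
    const float ndl = std::max(dot(n, toLight), 0.0f);
    const float ndh = std::max(dot(n, h), 0.0f);
    const float vdh = std::max(dot(toEye, h), 0.0f);

    // GGX normal distribution: highlights tighten as roughness drops.
    const float a = m.roughness * m.roughness;
    const float denom = ndh * ndh * (a * a - 1.0f) + 1.0f;
    const float D = (a * a) / (kPi * denom * denom);

    // Schlick Fresnel: metals reflect a tinted color, dielectrics roughly 4% white.
    const float f0r = 0.04f + (m.albedo.x - 0.04f) * m.metalness;
    const float f0g = 0.04f + (m.albedo.y - 0.04f) * m.metalness;
    const float f0b = 0.04f + (m.albedo.z - 0.04f) * m.metalness;
    const float fw = std::pow(1.0f - vdh, 5.0f);
    const Vec3 F = {f0r + (1.0f - f0r) * fw, f0g + (1.0f - f0g) * fw, f0b + (1.0f - f0b) * fw};

    // Metals get no diffuse term; their energy lives in the specular lobe.
    const float kd = (1.0f - m.metalness) / kPi;
    const float spec = D * 0.25f;
    return { (m.albedo.x * kd + F.x * spec) * lightColor.x * ndl,
             (m.albedo.y * kd + F.y * spec) * lightColor.y * ndl,
             (m.albedo.z * kd + F.z * spec) * lightColor.z * ndl };
}
```

The point is less the exact math than the fact that the same function handles a flickering torch and the midday sun; only the material’s roughness and metalness change what comes back.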

In the past, any photorealism Crytek was able to achieve was hampered by tech that wasn’t quite there.

“As soon as you’d actually do anything in the world with it in real time it would sort of break down. You need those shading rules while the light’s rolling over the surface — how it’s gonna react to a different index of refraction,” Tracy says. “But once you have an entire game that’s actually physically based, not only do you have a photoreal game, but you also have a photoreal game that can move.”

There are numerous other facets of the CryEngine utilized to create a realistic space. Tracy explains how CryEngine’s own LOD, or level of detail, generator frees up processing power by only rendering as much detail as is needed in relation to the player’s distance from any given object, similar to Simplygon; he says it’s not actually the polygons that gobble resources, but the number of different materials present in any given rendered model (so, for example, the wood, glass, metal and other materials inside a building).
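The distance-based part of that is simple enough to sketch. The structure below is a hypothetical illustration of LOD selection, not CryEngine’s actual generator.

```cpp
// Hypothetical sketch of distance-based level-of-detail (LOD) selection.
#include <cstddef>
#include <vector>

struct LodLevel {
    float maxDistance;  // use this level while the camera is closer than this
    int   meshIndex;    // which progressively simplified mesh to draw
};

// Walk from most to least detailed and pick the first level whose distance
// budget covers the camera; far-away objects get the cheapest mesh.
std::size_t selectLod(const std::vector<LodLevel>& levels, float distanceToCamera) {
    for (std::size_t i = 0; i < levels.size(); ++i) {
        if (distanceToCamera <= levels[i].maxDistance) {
            return i;
        }
    }
    return levels.empty() ? 0 : levels.size() - 1;  // beyond the last budget: lowest detail
}
```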

With advanced rendering tricks like geom caching, Crytek could soon revolutionize how in-game body and facial animations are achieved.

Ryse is also using a system called geom cache, which seems to have the quiet potential to be a revolution for rendering technology. Essentially, any time an in-game action requires movement — whether it’s an explosion of flame, waves crashing against a beaten shore or even potentially the primal scream of a Roman warrior — that animation requires a skeleton from which to build. Geom caching eliminates that requirement.

Geom caching works based on rendering techniques used in motion pictures. The film equivalent, called Alembic, pulls a point cache — essentially storing the positions of vertices on any given asset as a series of points that can be used in-engine — once per frame.

Geom caching does the same thing in real time, throwing out gobs of duplicate data that overlap frame to frame and thus freeing up enough memory to run these so-called prebaked animations on the fly. The result? The point cache data from the animation’s vertices replaces the need for a skeleton, saving designers a significant amount of time.
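As a back-of-the-napkin illustration of that de-duplication idea, here is a sketch of building a point cache that only stores the vertices that actually moved between frames. The data layout is an assumption made for the example, not Alembic’s or Crytek’s format.

```cpp
// Sketch of a frame-to-frame de-duplicated point cache; the layout here is an
// illustrative assumption, not Alembic's or CryEngine's actual format.
#include <cstddef>
#include <unordered_map>
#include <vector>

struct Vec3 {
    float x = 0, y = 0, z = 0;
    bool operator==(const Vec3& o) const { return x == o.x && y == o.y && z == o.z; }
};

// Each cached frame stores only the vertices that moved since the previous
// frame and their new positions; everything else is implied to stay put.
struct CachedFrame {
    std::unordered_map<std::size_t, Vec3> moved;  // vertex index -> new position
};

std::vector<CachedFrame> buildPointCache(const std::vector<std::vector<Vec3>>& frames) {
    std::vector<CachedFrame> cache;
    if (frames.empty()) return cache;

    std::vector<Vec3> current = frames.front();
    CachedFrame first;
    for (std::size_t i = 0; i < current.size(); ++i) first.moved[i] = current[i];
    cache.push_back(first);                       // frame 0: store every vertex

    for (std::size_t f = 1; f < frames.size(); ++f) {
        CachedFrame delta;
        for (std::size_t i = 0; i < frames[f].size() && i < current.size(); ++i) {
            if (!(frames[f][i] == current[i])) {  // duplicate data gets thrown out
                delta.moved[i] = frames[f][i];
                current[i] = frames[f][i];
            }
        }
        cache.push_back(delta);
    }
    return cache;
}

// Playback applies each frame's deltas to a running vertex buffer: the cached
// positions are the animation, so no skeleton is needed to drive it.
```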
Tracy says he hopes Crytek can further develop geom caching to replace the need for morph targets in facial animation, among other developments the company isn’t talking about yet.

“Once you have an entire game that’s physically-based, not only do you have a photoreal game, but you have a photoreal game that can move.”

“In the future what we hope to see is expanding the geom cache system to try to do facial and things like this, because again if we can get rid of the bones out of the face and not use morph targets and use something that’s even more advanced — that would make all the sense in the world to do.”

In any case, Ryse’s sophistication will likely soon be outclassed by whatever Crytek is doing next. Continual advancement is perhaps their real philosophy, and Tracy says they don’t plan on stopping any time soon.

“It’s never going to be like that for Crytek,” he says. “As we finish one piece of tech there’s ten other pieces of tech we’re wanting to work on or wanting to research.”

Nor has the developer’s pursuit of realism yet resulted in a fully-realized digital human replica — or even any would-be simulacra that may lie beyond.

“And I still don’t think we’ve totally overcome the uncanny valley,” Tracy says. “I think it’s gonna be awhile before we can actually break through that.”


Getting Rich Quicker

‘The Counselor’ is too real to cure your ‘Breaking Bad’ hangover

Obligatory spoiler warning: there are spoilers. You’ve been warned.

The opening titles of Ridley Scott’s The Counselor shuffle through an intimately known aesthetic touchstone: the wild, desolate wastes of the American Southwest. Intercut between the credits, Scott paints a particular portrait with shots of seemingly disparate imagery — a biker flying down a hot, flat stretch of road, two Mexican haulers routing a septic truck towards parts unknown, a pair of cheetahs hunting jackrabbits against the red cliffs and scrub of the Martian Texas outback.

Ok, so maybe the cheetahs are a little exotic for genre work, if you can call The Counselor that, though these wild cats are there for a reason. This is man’s country, Cormac McCarthy’s bread and butter, and with his debut screenplay here the Pulitzer Prize winner indulges in a lawlessness that’s as raw and terrible as it is an appeal to a certain brand of American individualism, consequences be damned.

Ridley Scott’s Western iconography is gorgeous and also a kind of subversion of expectations.

As timeless as they are, it’s been a while since we shared a long cultural embrace with Western motifs. Thematic implication (and McCarthy’s return to the vicious frontier pragmatism of No Country for Old Men) aren’t the only reasons this all seems so familiar. Taken at individual face value, you could also mistake any scene of The Counselor’s title sequence, without immediate context, as a stand-in opener kicking off a lost episode of Breaking Bad.

Appreciable identifications aside — Breaking Bad’s often detached intros were similarly incongruous for a show ostensibly dramatizing a Southwestern meth empire — it’s through Vince Gilligan’s crime series that the American consciousness has enjoyed its longest-lasting relationship with the Western in years. The amoral, often despicable chemistry teacher-turned-druglord Walter White is a modern-day cattle rustler if ever there was one, and fans reveled in his self-serving and increasingly drastic machinations.

There’s a reason this all seems so familiar.

“There is gold in the streets,” Walt rapaciously lectures early in the show’s final season, shortly after detonating a pipe bomb in a nursing home to rub out his former-boss-cum-opposition. “Just waiting for someone to come and scoop it up.” For the average viewer, watching Walt take back his raison d’etre after years of spineless existence in suburban hell was the perfect vicarious escape.

More thrilling than your life, isn’t it?

There’s a kindred sense of seizing one’s destiny (and underlying greed) that runs through The Counselor, which follows an unnamed lawyer’s one-time foray into the underworld of drug trafficking. While McCarthy is well-versed in this kind of territory, it still feels like a Breaking Bad reverberation, something fans have sought out in various forms since the series finale in September. The Counselor’s El Paso-Juárez split may as well be Walt’s Albuquerque, its coke his meth. The film itself is small and violent enough to play out exactly like a self-contained episode.

That hasn’t kept the film from facing a confused, somewhat tepid reception. Critics have mostly tripped over McCarthy’s bizarre, meandering dialogue, which slips in and out of cynical existential soliloquy at random and invokes more dramaturgy than straight screenwriting. Yet despite the dense script and an unconventional approach to well-worn subject matter (you won’t really grasp what’s going on until almost an hour into the film), the marketing bills The Counselor as a fairly straightforward thriller from an A-list director.

Instead of Breaking Bad you get something closer to a drug trade documentary. The Counselor’s progression is a spare, slow build, with much of its plot taking place merely in conversation rather than Hollywoodized action. It all goes wrong very quickly, as a miscommunication leads to a missing shipment of cocaine (that sounds more benign than it is), which riles up a pissed-off Mexican cartel. When the film’s savagery finally kicks in, its execution is graphic and joyless. Scott’s slick camerawork notwithstanding, there’s no room for escape here.

Coincidence.

That #HaveYouBeenBad hashtag the film’s marketing team used also misled expectations, taking advantage of what’s essentially a throwaway line in what certainly can’t be a coincidental echo of Walt’s world. The Counselor is what would happen if someone tried to break bad outside of Gilligan’s writer’s room, and its expectedly brutal consequences are probably a reflection of why no one seems to like it much.

Michael Fassbender’s Counselor himself is basically the antithesis of Walter White. Walt’s chemistry knowledge and ingenuity seem expertly suited for a life of crime, his Machiavellian manipulations and quick thinking making for highly entertaining television.

The Counselor is conversely much less prepared to involve himself in the drug trade, even on a temporary basis. He only knows the business from a legal end, and is more or less blind as to what could happen if things don’t go smoothly. It sounds easy, but then it always does: a single in-and-out deal with Reiner, Javier Bardem’s shady nightclub owner and the Counselor’s underworld connection, that’ll make life perfect for his new fiancée, Laura.

Nor does the Counselor really even seem to need the money. Although he tells Reiner he’s financially “against the wall,” neither Scott nor McCarthy gives any notion of the details, let alone the Counselor’s struggle.

The Counselor gets involved in some shady dealings. How would you do?

We do get outward appearances. The Counselor has a country club membership, a GQ-spread closet full of impeccably cut suits and the kind of modern, minimal living space that few experience outside of the movies. Early in the film he flies to Amsterdam, diamond shopping for Laura’s engagement ring. When said deal he’s using to finance his happiness goes south, he quietly goes to pieces, completely out of his element.

Unlike Walt, the Counselor is nearly a blank slate. Fassbender plays him with a nonchalance and an air of smug-if-subtle control, though McCarthy never really affords him any. Accordingly the film doesn’t give the audience any connection — or any release, either.

There’s no room for escape here.

The Counselor is warned off more than once, essentially being told flat out this is a bad idea. In the scripted realm of television, it can be hard to shake the feeling that there’s always some narrative trap door the protagonist can take to flee from harm. Whenever it mattered for the sake of tension, Gilligan gave Walt that. McCarthy isn’t so merciful.

“What do you think I should do?” the Counselor asks Reiner late in the film, desperate for some guidance in shark-infested waters. “I don’t know, Counselor,” Reiner says. “I don’t know.”

Variations of this conversation appear multiple times throughout McCarthy’s script, echoing what any average law-abiding upper-middle-to-upper class citizen might say or do if they were in the Counselor’s shoes. For most, the outcome would probably be the same.

Same difference.

Interestingly, nearly everyone in The Counselor seems as clueless as Fassbender’s lawyer. Bardem’s ostentatious, affable Reiner is oddly harmless — he’s afraid of his lover (Cameron Diaz, more or less playing the younger sister to Kristin Scott Thomas’ character in Only God Forgives [NSFW]) and doesn’t even seem capable of skirting the law, even when he’s casually discussing horrifying instruments of death.

Westray, Brad Pitt’s smooth-talking Western-wear middleman, somehow believes he isn’t implicated, even after the Counselor is. Both Reiner and Westray end up dead, the latter by an excruciating motorized bolo-piano wire-esque device that, if it’s not just a product of McCarthy’s imagination, probably should be.

And the Counselor himself? He ends up paralyzed with fear, holed up in Mexico after Laura is taken by the Cartel. One night he gets a package with a DVD that says “Hola!” on it—you can guess the unwatched contents. McCarthy leaves the Counselor’s ultimate fate a question mark, but either way it’s pretty bleak.

Even the smartest men can be fools.

Another somewhat ponderous crime film marketed as a thriller, 2012’s Killing Them Softly, received an analogous response. It didn’t have Breaking Bad’s sensibilities either. Not even close.

Hey, it’s Hank!

Here the Western iconography is maybe a little too close to Breaking Bad’s Albuquerque (we even get a cameo from Dean Norris as a Hank-ish character, only acting on Walt’s side). If Gilligan’s take on the genre was free license to run rampant through the criminal unknown, The Counselor’s is a bullet to the head. Compared to Walt, the Counselor proves that even the smartest men can be fools.

“The best westerns are about man against his own landscape,” Scott recently told the New York Times in an interview for the film. Sometimes for the audience that landscape is an ugly mirror.


Glancing Past Walt

Long after it’s over, Breaking Bad’s legacy will be a case study in gratification

Warning: Spoilers ahead. Don’t ruin it!

It’s impossible to escape the maelstrom of Breaking Bad. It’s likely you’re friends with or know someone who watches it. Chances are probably even better that as of this writing at least one of them is waiting with a tightened gut for the series finale to air this Sunday.

If you’ve visited just about any site with anything resembling media savvy since the mid-season premiere in early August, you’ve undoubtedly encountered countless reactionary articles devoted to the show’s meta-cultural presence. No stone’s been left unturned: the series’ effect on television, outlandish prediction pieces, articles bemoaning the many oh-my-god moments that have proliferated throughout the back half of the final season with increasingly alarming severity – and let’s not forget Buzzfeed lists.

#TreadLightly is just one of the hashtags that’s cropped up during Breaking Bad’s final season.

That’s to say nothing of social media, which for any unfortunate fan who hasn’t immediately tuned in over the past two months has become a razor’s-edge minefield of spoilers, memes and trending hashtags referencing in some capacity whatever the most recent shock was.

The implicit threat of Walt’s “tread lightly” has rung true all season online – the virtual watercooler effect for Vince Gilligan’s Shakespearean-tinged crime saga has been so hazardous for latecomers that Dean Norris tweeted fans to “stay the fuck off[line]” if they weren’t caught up. (Amusingly, Norris then made fun of anyone who didn’t listen, just as Hank probably would.)

I’m not here to insert my opinions about the end of Breaking Bad. As much as I love the show, once the screen cuts to black for the last time, we’ll move on, forgetting about the fates of Walt, Jesse et al. This has happened before and it’ll happen again.

The narrative threads that must somehow be tied up after last week’s penultimate episode – the ricin, Jesse’s horrifying imprisonment, Walt’s thirst for revenge against Uncle Jack and his crew – are emotionally salient, sure. Nor will I dispute the widely held claim that the series is among the best written, paced and acted ever created, its often excruciatingly tense scriptwriting notwithstanding.

Once the screen cuts to black for the last time, we’ll move on.

Of course there’s no shortage of other excellent television dramas being produced today, all vying equally for our already-addled attention, from Game of Thrones and Mad Men to Downton Abbey, not to mention business-model outliers like Netflix’s House of Cards.

That these shows don’t all air at the same time of year is a minor godsend, sparing TV junkies the meltdown that comes from trying to consume too much too quickly. (More to the point for network execs, a spread-out schedule also decreases the chances of an aggravated ratings battle royale among the heaviest hitters.)

But Breaking Bad’s dramatic climax, whatever it may be, doesn’t tell the whole story. It’s not the most interesting aspect of the show and it certainly isn’t the most important. Narrative relevance aside, the real depth of its legacy lies in how the series has captured and held its place in the zeitgeist as arguably few others have in modern times.

Say his name.

There have been a lot of articles analyzing Breaking Bad’s slow-burn Nielsen upswing since its 2008 pilot. The show was taking in around 1.3 million viewers by Season 2, a figure that rose steadily to just under 2 million by its fourth season. Last summer’s fifth-season premiere jumped to about 2.9 million viewers, and the final eight episodes have averaged about 5.2 million, climbing ever higher as the series draws to its inevitable close.

Hungry?

Netflix shoulders a lot of the blame here. The streaming giant unleashed the first three seasons for mass consumption in 2011, adding Season 4 last July, just in time for hungry fans and curious newcomers to choke down the whole series before watching the final episodes begin to unspool in real time.

Netflix is nothing if not an instant gratification platform. Like a first taste of Walt’s product, the ease of sampling Breaking Bad’s wares in a low-risk environment – if you don’t like it, turn it off – was an irresistible offer for many who hadn’t experienced the show before, myself included.

Though Netflix doesn’t release viewing numbers, the company has said that 50,000 subscribers binged all of the fourth season the day before the final season’s July 15 premiere. That may not seem like a lot against the streaming service’s nearly 40 million global subscribers, but it happened in a single day, a full year after the show’s Netflix debut.

AMC has done its fair share of promotion, too. The network has run series marathons, rerun episodes before premieres, marketed video releases and offered the show on demand, promoting the hell out of its chemistry-teacher-turned-meth-kingpin throughout the life of the series, though none of it affected the numbers much at first.

AMC ingeniously made us wait a year to find out what Hank would do following the S5 midseason finale.

Upsetting as it may have been for fans, splitting the final season into two halves was an ingenious move on the network’s part. The agonizing, year-long wait to find out what would happen after Hank’s realization of Walt’s true identity raised the show’s zeitgeist profile, leaving it primed to explode over the last eight episodes.

As if in response, AMC has since pulled out all the stops to make sure you are aware of Breaking Bad: social media and websites have been bombarded with abandon, cast appearances scheduled, the show’s official website stuffed with comedic mini-episodes, and there’s been actual, serious discussion of the essentially greenlit Better Call Saul spin-off.

The network went as far as having Bryan Cranston do a dramatic reading of Shelley’s “Ozymandias” as a pre-premiere promo, the thematic concerns of which should be fairly obvious to anyone who’s paid attention to how the last season has unfolded. Meanwhile, Netflix stepped in again, adding the first half of Season 5 in early August ahead of the mid-season premiere – none of the timing here was by accident.

What this all adds up to is that Breaking Bad currently occupies a significant place in the national conversation (winning its first Emmy on Sunday for best dramatic series didn’t hurt), to the point where anything slightly related is deemed worthy of interest, running the gamut from goofy parody to New York Times-enlisted fictional articles. Yet its status there is fleeting.

Promo art: a small yet inescapable part of Breaking Bad’s fifth season marketing machine. #Bitch.

Like Netflix, cultural trends are a form of instant gratification. In the era of instantaneous communication and unlimited sharing, the consumption of media is fueled by whatever has captured the public imagination, from television to viral Youtube videos. And as with all pop artifacts, our obsession – and right now Breaking Bad fever really is one – only lasts as long as our attention spans allow.

Walt and company aren’t the only example on television, either. Take “The Rains of Castamere,” the ninth episode of Game of Thrones’ third season, better known as the Red Wedding. The episode is an all-time high for a series already known for its brutality; the now-infamous scene, which depicts the vicious murders of Robb Stark, his pregnant wife, his mother and numerous clansmen in one fell swoop, ranks as one of the most graphic displays of bloodshed in television history.

When the episode aired, “#RedWedding” instantly started trending on Twitter as the internet exploded with outbursts of shock and outrage. Unlike Breaking Bad, Game of Thrones has George R.R. Martin’s original literary source material to draw from, and arguably just as large a contingent of fans had been waiting with queasy anticipation to see how Walder Frey’s treachery would be shown on screen.

The Lannisters send their regards.

That difference didn’t stop a whirlwind of bloggers, TV critics and social media users from reacting in the most public ways possible. Even if you watched Game of Thrones but didn’t know what the Red Wedding was, you probably knew it was coming.

So the internet mourned; we all observed a collective moment of silence and let the conversation lapse. A week later, no one cared. The zeitgeist had shifted.

Our obsession with pop artifacts only lasts as long as our attention spans allow.

Downton Abbey, too, enjoyed its own moment of social media uproar after Dan Stevens made it very publicly clear that he would be leaving the show at the end of Season 3, forcing showrunner Julian Fellowes to kill off yet another character before his time. Other than a few “What’s going to happen to Lady Mary?” articles in advance of the just-underway fourth season, the sustained public focus on Stevens’ departure has been negligible.

The continuously evolving media and tech landscapes don’t help this lack of focus. The modern collective consciousness lives to consume, and boredom comes easily. It’s a problem seen across the spectrum of instant gratification, spilling far beyond the bounds of media consumption.

Social interaction, 2013.

Texting and tweeting have become so fast and so second nature for many of us that our brains are hard-wired to hit “send” without bothering to check for accidental autocorrect errors. In public, people are glued to their smartphones, checking them incessantly as they wait for that sweet hit of dopamine that means someone cares about them, or is at least keeping up the appearance of caring, for a few seconds.

The Canyons offered a sharp commentary on this by giving the smartphone its close-up, but no one paid attention to the film for more than a cultural minute, and it only received that much scrutiny because of the Lindsay Lohan factor. It’s the era of the Glance – nothing current matters much when something new is threatening to appear on the horizon. Why do you think Apple pumps out new announcements every six months? Whatever your preference, it’s all too easy to be a consumption addict.

Miley Cyrus’ “Wrecking Ball” destroyed Ylvis’ “The Fox” in Youtube views by almost 100 million hits. Wonder why.

The battle over 2013’s would-be song of the summer is another particularly salient example. “Blurred Lines” had seemingly taken the crown in whatever industry circles count as influential, after Robin Thicke’s NSFW video helped the song go viral in a major way. Then came Miley Cyrus’ performance at the VMAs, and all of a sudden Thicke was culturally twerked out of the spotlight, ironically during his own song. Shocker: in the aftermath, twerking went viral itself.

That wasn’t good enough for Gawker. Early this month they independently made the somewhat laughable claim that Ylvis’ latecomer “The Fox” – a weird Norwegian send-up of the stereotypical pop song that was basically a pre-packaged meme waiting to happen – would actually take the coveted song of the summer throne.

Less than a week later, Miley calculatingly re-entered the conversation with her own hook-heavy, dub-breaks single “Wrecking Ball,” in which she licks a sledgehammer and straddles the eponymous piece of construction machinery sans clothing. (Considering the two songs’ respective 54 and 153 million Youtube views, it’s safe to say that when it comes to instant gratification, tits unsurprisingly always win.)

We live to consume and boredom comes easily.
It’s the era of the Glance.

What’s really fascinating about Breaking Bad is that it’s not AMC’s biggest show – not by a long shot. The Walking Dead regularly trumps Walt’s meth operation with upwards of 12 million viewers, more than double Breaking Bad’s season record.

So why isn’t there as much buzz about the zombie Robert Kirkman adaptation? Probably because the critical reaction has been mixed, the behind-the-scenes drama with its showrunners keeps resurfacing and consensus opinion remains less than impressed. The Glance, it seems, will not be denied.

The writing produced by Vince Gilligan and his team has never hit lows like The Walking Dead’s, even at its most flawed or ridiculous moments. Considering Breaking Bad has been written week-to-week throughout its run (with the exception of Season 2), it’s amazing that the panicked, runaway-train feeling the show’s arcs sometimes have has never derailed its momentum.

Whatever Walt’s fate may hold, a comparative few will be saying Heisenberg’s name in a matter of months.

Since there’s no shortage of consumption opportunities, it makes sense that only the most gratifying sources of entertainment survive. If Breaking Bad has taught us anything, it’s that the culture machine is insatiable. We may mourn for Walt and Jesse after Sunday, but in a month or so we’ll have forgotten about them, and that won’t really matter.

The next obsession is always just in the periphery.


A Deeper Valley

Despite a decade of improvements, visual effects in film remain a less than perfect deception

With a seven-year gap since Alfonso Cuarón directed Children of Men, it’s no surprise that his return to the screen in next month’s Gravity is hotly anticipated. Warner Bros. is (likely to Cuarón’s specifications) being cagey with the film’s details, releasing only a handful of trailers that show snippets of what seems like a very small narrative set-up: Sandra Bullock’s character fighting for control after a violent collision in zero gravity.

Response to the carefully cut shots that have been released has been overwhelmingly positive, with critics and film buffs abuzz over the claustrophobic terror that lies at the core of Gravity’s public introduction. It’s a notion I really want to get on board with, and also one I’m having a hard time accepting.

Interestingly, Gravity’s sci-fi aims are realistic – its premise looks like a simple exploration of two people coming to grips with the existentially horrifying scenario of panicked survival in the unforgiving vacuum of space. This is a better match for Cuarón’s narrative tastes than, say, Ender’s Game would be.

Alfonso Cuarón’s directorial style is both organic and intense. (He also loves tracking shots.)

His Harry Potter installment The Prisoner of Azkaban was arguably one of the more human entries in the film series, balancing necessary CG elements with the revelatory weight of Harry’s lineage and Cuarón’s organically tempered camerawork. (Children of Men, shot in a space somewhere between POV and verité style, is an even stronger example of how the director reflects the underlying humanity in his narratives through technique, and includes one of the most beautiful and harrowing continuous tracking shots ever put to film.)

This isn’t a new trend.

Pathos is what makes these stories work so well, and the technological limitations present in Gravity’s contemporary setting seem to promise an imminent, character-driven intensity rather than big, dumb sci-fi popcorn. There is no armada of megaton, fusion-drive leviathans here, no alien menagerie, no humanoid androids just off-kilter enough to trigger an inner neurological response. That’s not a bad thing.

But for all its grounding, Gravity isn’t quite out of the bounds of its own uncanny valley: the virtual environment it takes place in.

You’ve probably experienced the effects of the uncanny valley. A psychological concept floated by Japanese robotics scientist Masahiro Mori back in 1970, it attempts to explain why we find artificial or humanoid reproductions unsettling compared to a living, breathing human.

Tron Legacy’s young Jeff Bridges is pretty creepy.

The most common exposure is probably seeing CG characters in films, and the valley’s boundaries should be immediately evident to anyone who’s ever watched an effects-based creation in a movie and found it more creepy than convincing. (And the more “human” these rendered lifeforms try to be, the less we’re generally convinced of their authenticity.)

Since we can’t reconcile reproductions of ourselves with how our brains already know humans look and behave, the uncanny valley has been restricted to the existential confines of whatever is made in, or at least closely resembles, our own image. The results aren’t guaranteed to work.

Earlier efforts like the cast of Robert Zemeckis’ Beowulf or David Fincher’s de-aged Brad Pitt in Benjamin Button feel more like shiny facsimiles than physical beings (to be fair, Beowulf was completely CG, though it touted then-state-of-the-art mo-cap of the actors’ faces).

Which dips deeper into the valley?

Of course, the sheer amount of detail visual effects can deliver increases with every passing wave of tech advancements, and it’s become significantly easier in the past few years for filmmakers to push rendered characterizations in a way that’s believable on-screen. But there’s an inherent cost to all this facial and bodily progress, too. With the past decade seemingly spent almost entirely on improving “living” visuals, the art direction around them has suffered.

This isn’t a new trend. When George Lucas revisited Star Wars for the series’ needless, abortive prequel trilogy, two of the three new films were shot all-digitally, and each was stuffed with so many CG-created sets and setpieces that the actors may as well have been running around in a ‘90s point-and-click adventure game.

The result – a universe abruptly divorced from the reality of palpable sets and locations and replaced with a digital landscape built almost entirely inside computers – is about as credible as it is easy to take seriously.

With Lucas having also digitally tampered with the original trilogy so many times across various video releases, that’s a long pall cast over the series, and one Disney will have to work carefully to lift.

Some legacy.

It’s like I’m really there!

Compare that to the original Lord of the Rings trilogy, which counterbalanced the CG additions to its Middle-Earth with both the natural beauty of New Zealand and some amazingly detailed sets the cast could walk on, not to mention a population of thousands of flesh-and-blood extras in makeup and costume. Which universe seems more real?

Genres like sci-fi and fantasy commonly rely on dense, fantastical worlds whose construction costs would be astronomical if every space were built by hand and shot mostly on location. Movies are a tech-powered industry, and it’s clear that a certain amount of visual effects is needed to pull off some types of fiction. That doesn’t mean anyone should get an automatic free pass to abuse CG as a catch-all. Yet this is exactly what Hollywood seems to be skewing towards.

No one should get a free pass to abuse CG as a catch-all.

Not even Peter Jackson is immune. Thus far his return to Tolkienian fantasy with The Hobbit has felt generally hollow and lifeless, thanks to an incredible amount of superfluous CG. Without the old mix of practical and digital effects grounded in terra firma, the Middle-Earth of 2012’s An Unexpected Journey is almost unrecognizable.

Lacking any sense of physicality, the film’s dwarven cities, elven sanctuaries and underground goblin shantytowns alike read as videogame cutscenes rather than tangible locations of wood, stone and steel. Held up against the original trilogy, the difference is jarring. It was heartbreaking to see the familiar realm I so loved in Lord of the Rings reduced to so many 48 FPS-tailored pixels, and I found myself repeatedly taken out of the story because of it.

Sir Ian McKellen, Cate Blanchett and a whole lot of not much else.

On a purely visual level, Disney has been particularly egregious of late, making fans wonder just how much they should believe claims that Star Wars Episode VII will focus first on story and characters over effects. Their last two big fantasy films haven’t done them any favors.

Their 2010 Alice in Wonderland redux, along with this year’s Oz The Great and Powerful, could just as easily have done away with real actors and locations for all the tactile difference they made. (Tim Burton’s Alice in particular is a creepy cartoon circus that echoes Lewis Carroll’s manic language with an aesthetic that’s as plastic as it is hideous.)

At least Avatar had the decency to wear its tech guts on its sleeve, with James Cameron taking every possible opportunity to flaunt the film’s overwhelming, all-encompassing, 3D-fueled setting of Pandora. No one who contributed to the film’s $2.7 billion worldwide box office had any doubts that they were going to see a nearly all-CG spectacle.

More importantly, for all its flaws (and unlike the Star Wars prequels) Avatar’s effects were classy enough that it was all too easy to forget you were ever watching humans on a soundstage interacting with virtual objects against a greenscreen.

Alice in Wonderland is one of the most egregious recent examples of environmental uncanny valley abuse.

You can still find directors who prefer to shoot scenes the old-fashioned way whenever possible. Take the hotel fight in Christopher Nolan’s Inception – had Nolan merely mocked up the scene with CG, it would have lost a crucial element of perceived realism.

Joseph Gordon-Levitt wouldn’t have appeared to be defying gravity against the hotel’s rapidly shifting gravitational center while fending off the dreamworld’s thuggish projections. He’d have looked like he was flipping through a virtual space being artificially altered around him. Instead, Nolan used a giant rotating set so he could shoot the scene with authenticity.

It’s these films – the ones that continue to employ miniatures, models and environmental constructions (and, yeah, real explosions) alongside CG flourishes – that will stand the test of time. Inception is in many ways an unprecedented visual achievement, and I doubt it will have lost that distinction 20 years from now.

Christopher Nolan used an actual rotating set to shoot Inception’s amazing hotel fight scene.

Blade Runner’s 2007 Final Cut, the most polished re-release of Ridley Scott’s seminal 1982 sci-fi masterwork, looks in many ways like it could have been released this year when played back in 1080p. Scott’s Prometheus similarly matched any CG foregrounding with a sumptuous backdrop of stunning landscape shots and locations that had a presence and weight you could feel. Bit by bit, filmmakers like Nolan and Scott are becoming outliers.

You should see it in motion.

Gravity itself, which follows two astronauts caught in a disaster involving the International Space Station, could perhaps have been shot using other methods, though I can somewhat understand Cuarón’s desire to bring a film set in an oxygen-free void to life through heavy visual effects. Reportedly the director’s only other acceptable option was filming both Bullock and George Clooney suspended in temporary zero-g aboard an aircraft that would break through the stratosphere before entering freefall. Not exactly a walk in the park.

I want to be wrong.

Cuarón’s solution for Bullock was to put her in a cube smaller than a jail cell for eight to 10 hours a day to mimic the extreme isolation of floating through space, strapping her into a harness and filming her performances with a camera that would accelerate straight at her, to within an inch of her face.

Yet I can’t shake the feeling that, for all the methodology and in spite of Cuarón’s considerable talent, Gravity’s CG could taint the film’s essence. How are we supposed to go on an emotional journey when the physical world around it feels so distracting?

Hopefully Cuarón’s visionary ability will overcome the inherent visual limitations of Gravity’s CG.

Given Cuarón’s past films, I want to be wrong about my initial impressions. Hell, I’ll be glad if I am. If anything can assuage my fears and pull Gravity beyond the possible limitations of its visuals, it’s the director’s vision. (After the trauma of An Unexpected Journey, I’m thinking this year’s Desolation of Smaug will probably be more of a lost cause.)

That said, it’s pretty damn hard for me to watch even a trailer for Gravity without my immersion instantly breaking when Sandra Bullock’s unmistakable face appears in what otherwise looks like a big-budget space-disaster computer simulation. When the uncanny valley starts breaching even our more serious contemplations, it may be time to re-evaluate our judgment.
