After arriving in Los Angeles for E3’s annual industry extravaganza, I walked around a lot and wondered what the hell was happening to videogames.
“Steve. Steve. You’re actually here, so you know what that means. I’ve gotta show you Bioshock.”
I’m in Portland, sitting on a couch in my friend’s basement apartment. It’s been a week since I left Los Angeles following the close of June’s Electronic Entertainment Expo, the biggest professional videogame trade show in the country and one that I’ve grown increasingly weary of covering over the past four and a half years. I sigh, rolling my eyes. I haven’t been home to Seattle now in almost two weeks. I’m a little bit drunk and I know there’s no talking my overzealous pal out of what’s about to happen. He takes my non-response as a sign to proceed, grabs Bioshock Infinite from the shelf, throws it in his 360 and plops himself down on the couch next to me. I reluctantly take the controller.
I’ve been avoiding the newest Bioshock, more or less, since a couple months before its release this past March. The way the videogame industry works is that once a huge, mega-hyped release is just over the horizon, the frenzy of anticipation becomes like a black hole – regardless of personal involvement, the only way to escape some type of exposure is to get offline and turn off your phone.
After Booker DeWitt climbs a mysterious lighthouse and is rocketed tens of thousands of feet into the air to the floating metropolis of Columbia, I am reminded that, in theory, I don’t really have anything against Infinite. In a not-unexpected display of graphical prowess, my introduction to the city fills me with a mix of confusion and wonder. Perhaps Booker feels the same as his metal cage shoots past zeppelins shining in the high sun and levitating structures that loom with imposing presence and ornate architecture. The theme park world feels like something Walt Disney would have created in an alternate timeline of history, if the man had been an unhinged sociopath with plans for unprecedented cultural eugenics.
When the ride slows to a stop, I’m deposited into a sunlit alcove with a central mural and a backdrop of soft choral tones. I already know what awaits. Infinite’s baptismal ceremony has been the subject of much debate amongst the press, though given the game’s overt religious zealotry the inclusion is neither surprising nor offensive to me. In the brief moments before approaching the baptismal chamber, a long candle-lined path in ankle-deep water for holy induction, I’m nevertheless struck by the weight and implication of the words on the stained glass in front of Booker and me: AND THE PROPHET SHALL LEAD THE PEOPLE TO THE NEW EDEN.
As I step forward into the alcove, a twinge of sadness hits me along with a familiar mechanical sound: an objective marker appears on the screen commanding me to FIND A WAY INTO THE CITY. The overly game-y intrusion makes me want to put down the controller. I am suddenly out of the moment, any emotion I was supposed to be feeling at the scene snatched away with the appearance of a cloying objective hint. I have a fleeting, vaguely blissful notion of quitting the game industry.
I begin walking the path of Columbia’s righteous. “Path” is the best descriptor here, as Infinite’s narrative claws are dragging me down this straight and narrow line where I weave back and forth, trying to jump over the candles on either side of me.
“You have to go forward, Steve,” my friend says. He is clearly annoyed, having to voice a design parameter we both already knew was indisputable. When the priest dunks Booker’s head and we have accepted the new prophet, I take my first steps into the suspended city with no particular enthusiasm.
Later, confronted by Columbia’s casually übermenschish patriotism, I shake my head. “See, I like the ideas here,” I tell my friend. “The themes and the art direction are really interesting, but the problem is that I know it just turns into a big, stupid shooter.”
“And that’s a totally valid complaint,” my friend says, agreeing with me for once. “It does turn into a big, stupid shooter.”
E3 2013, my fifth, was unexpectedly weird. It ended in mid-June and I’m still coming to grips with what it means. It wasn’t that E3 itself was so strange; it was the whole industry. Even the genuine excitement that bubbled up following Sony’s PS4 blowout announcements felt like an oddity, a rare moment of enthusiasm that spilled over and fizzled out too quickly, like bad champagne. The consumerist mess that would follow over the next three days soon drowned it out.
It had been Sony’s moment to seize. Microsoft’s hour-long press conference had been concerned with assuaging the fears of hardcore Xbox 360 fans out for blood over the Xbox One’s staunch DRM measures. Before arriving at Los Angeles’ Galen Center for the MS presser that morning, the first things I noticed from the cab window were two brightly colored McLaren MP4-12Cs advertising Forza Motorsport 5. May’s Xbox debut event in Redmond was all bros, TV and testosterone, so it didn’t come off as much of a surprise to be greeted by a couple of hypercars casing the building. I remembered the disastrous Cirque du Soleil Kinect reveal a few years back where journalists were made to wear brightly colored electric ponchos. I was glad I wasn’t there. With Sony’s conference still hours away, it seemed like 2013 was so far shaping up to be E3 as usual.
I don’t usually attend press conference day. Whatever the console makers and third-party heavy hitters have to say to a live audience is simultaneously broadcast as a streaming video, so you can just as easily watch the conferences from the comfort of a hotel room or a barstool. The scant news that comes out of these things – often sandwiched between lame scripted banter that makes the Triple-A videogame industry skew depressingly close to bad Hollywood stereotypes – always hits immediately anyway.
At Sony’s conference, I snuck into a restricted “bloggers only” area to get a better seat – not that anyone was actually monitoring the handful of wifi-enabled tables. The setup was par for the course, the stage surrounded by an arcing array of television screens looping a sizzle reel of PS4 launch titles, the speakers pumping hit singles designed for mainstream appeal. After the Xbox disaster that morning, what reason did I really have to be here? I possibly just wanted to see firsthand what sort of counter Sony would have in the works for their Redmond competition. Or maybe I wanted to see a train wreck.
Then Sony Computer Entertainment America president Jack Tretton announced, not without a certain degree of smug (and deserved) satisfaction, that the PS4 wouldn’t employ any DRM or “always online” policies. The concept of digital ownership (whether or not a purchased physical game disc is actually your property) had, up to that point, been coming under heavy fire. The DRM and used-game countermeasures the Xbox One was touting would limit players from trading physical games with friends, and its online requirements would make the console useless if not tethered to a constant high-speed internet connection. When the uproar of the crowd issued forth from the mass of journalists and professionals gathered in the Los Angeles Memorial Sports Arena, it really seemed worth being there, fake live-blogging the event. For a brief second, it almost felt like E3 might, for once, be worthwhile. Videogames might be OK.
The show floor of E3 at the LA Convention Center is something you gradually come to ignore. The two halls where most of the industry’s big-budget, Triple-A muscle congregates are cacophonous cathedrals of colored light, deafening orchestrations of machine gun fire and gooshy carpet you could easily fall asleep on. This is a place devoted to the worship of the annual Call of Duty, Madden and Assassin’s Creed installments by way of the elaborate iconography of multi-million-dollar booth displays – mechs and marines and monsters and scantily clad models calling out, appealing to one’s specific spiritual devotion. The throngs at mass: gawking fanboys, Gamestop stooges, camera crews and industry executives all effortlessly wearing the same white-or-light-blue starched shirt and dark-gray-or-navy-blazer combo. In many ways, videogames are a uniquely Californian industry and, if you don’t live in development hotbeds like San Francisco or Los Angeles, the glitzy promotion feels a little less everyday. A theme park for yearly pilgrimage. We are come to worship at the altar and preach the gospel of videogames. And the prophet shall lead the people.
Attending E3 as a reporter, as much as the word applies when covering anything in the entertainment business, there is no time to be exhausted by spectacle or the endless mass of people. The crowds gathered to watch the multiplayer footage of Shooter X are obstacles. You view the path in front of you in dynamically changing angles, a series of real-time directions carved out towards the next appointment: cut through the left of this slow-moving group, shift right, twist your shoulder, thread across the comparatively empty space of the booth next to you, lift up your messenger bag and adjust your weight slightly to avoid oncoming traffic, taking in the passing sights only as needed to find the quickest path to your next destination among the hubbub.
Briskly moving past the display for Capcom’s Dead Rising 3, a zombie cage populated with actors paid to shamble around and ravenously moan at passersby for the duration of the three-day show, I briefly entertained the thought of sticking my hand through one of the holes in the chain link toward the undead girl licking her chops in my direction. How far would this LA performer go with her character motivations? Just like the location of each major publisher’s booth on the floor, the action year to year never really changes.
It is easy to feel the drudgery of E3 increase as the number of shows you cover begins to mount. The conference highlights the most bloodsucking aspects of the videogame industry while ignoring the fascinating designs and people and ideas that are pushing the medium forward. Games are less a medium here than a money-grubbing bottom line.
Change has been gradual. The unveiling of a slew of new indie games for the next PlayStation was not a surprise once Sony announced a commitment to innovative, smaller-scale downloadable titles at the Game Developers Conference in March. Yet the reassurance that the PS4’s first year would see a number of less-mainstream titles from a company also responsible for publishing games like Killzone – from whose name alone you can probably guess the depths of its imagination – was nevertheless refreshing.
Sony’s booth was populated with creative-minded titles, something the industry desperately needs at its heavily advertised forefront, where “build your own game” sandboxes like LittleBigPlanet and the Xbox One’s similar Project Spark aren’t enough. Other than PS4’s efforts and Indiecade’s tiny oasis in the corner of South Hall, walking around the show floor wouldn’t give you any indication games could be anything other than shooting terrorists and players teabagging each other. E3 is a truncheon to progress.
For the self-serving bottom line, 2013 was – and in most ways should have been – a banner year for the Electronic Entertainment Expo. Consumers want to know about next-gen hardware. They want to know what they’ll be playing when they get a PS4 or Xbox One, to see what these allegedly new game experiences are capable of.
I’m interested in next-gen tech, too, though my curiosity involves pushing it beyond the surface level titillation of multi-million dollar videogames that look one step removed from CG. What might developers do with that power? The potential implications next-gen could have on artificial intelligence, for example, are enormous.
AI has always been something of a means to an end. It’s rare to hear PR reps talk about revolutionary steps forward with its implementation, aside from simply highlighting how this new product is a ruler’s length smarter or more adaptable than whatever it was in the last title the developers worked on. After a while, videogames become a series of systems laid bare for any observant player to see: a network of if/thens, branching paths and executable decisions based on user input. It has been a long time since the grunts in 1998’s original Half-Life worked together to outwit and outflank you, a frightening prospect back then and one that revolutionized the sophistication first-person shooters were capable of.
I wouldn’t have expected Microsoft to be the one company seemingly pushing AI advancement. They took a lot of shit over their seemingly ironclad policy of a required continuous internet connection (“Fortunately, we have a product for people who aren’t able to get some form of connectivity,” smirked then-Microsoft president of interactive entertainment business Don Mattrick in an interview during the show. “It’s called the Xbox 360.”), but that constant connection opened up cloud computing that could be put to use in any number of ways. Surprisingly, Forza 5 seemed to be leading the AI pack by leveraging just that.
Rather than resorting to typical AI, which in racing games usually means a sliding scale of aggressive-to-unfair behavior from rival computer-controlled drivers, Forza 5 uses the ridiculously named “Drivatar” system, which replaces AI with actual data from other players connected through a cloud server. Every player in Forza has a Drivatar that continues to drive around (whether you’re actively playing the game at the moment or not), gathering data in your absence. When you take control, instead of driving against computer opponents, you’re driving against other Drivatars. I’m not totally sold on Forza 5 as the “end of AI,” as MS handily touted it – you learn to take these sorts of claims with a grain of salt – but it’s certainly an interesting prospect.
Or, it might have been. In the weeks following E3, Microsoft reversed nearly all contentious policies for the Xbox One – no more draconian restrictions on used games that would limit the number of friends you could lend a disc to (or the number of times a game could be so lent), no publishers determining how many consoles a game could run on, no activation fees if someone wanted to play a used game. Rules for reselling used games are no longer in the hands of publishers, either, and you don’t have to be on someone’s Xbox Live friends list in order to borrow a game from them. In addition, with the always-online connection and 24-hour online check-in requirements abolished, plans developers may have had to utilize the Xbox One’s cloud storage to free up memory for weird experimentations like Forza 5’s Drivatar may no longer be applicable.
The degree to which MS is now backpedaling away from their crazier notions for the cloud is anyone’s guess, but considering the way major videogames are viewed only as a business these days, I wouldn’t put much stock in a robust plan for online innovation.
In general, AI wasn’t a major concern for companies with next-gen titles on display. To be fair, it’s hard to judge what you see in an E3 presentation as totally representative of what a game may actually end up being. These are carefully controlled exhibitions, sliced out to give you a general impression of whatever they’re showing off, an information drip that lasts from one conference or media event to the next. It takes any developer a while – as in, a number of years, usually – to learn and gain control over the programming parameters of new hardware. But I’d be lying if I said it wasn’t a little disappointing to see the future of videogames depicted solely by sexier graphics. Shooters now had newly intense particle effects but stealth games still sported rudimentary sentry paths. Not exactly a vast technological sea change.
Admittedly I will probably play Battlefield 4 when it releases, because everyone who plays games likes indulging in the shallow joy of sexier graphics every once in a great while. I doubt I’ll pick up another military shooter, though, until someone makes the videogame equivalent of War, Sebastian Junger’s sobering account of 15 months spent embedded with a platoon of American troops in Afghanistan’s hostile Korengal Valley. Maybe the Call of Duty dog, as wonderful as its existence may be, is the signal that the modern military genre has jumped the shark and that it’s time to find some new trendy bandwagon to jump on and eventually get sick of.
Already it seems that trend may be the open world, with go-anywhere freedom quickly in danger of becoming the go-to buzzword for so-called next-gen design. I guess if improved AI just equates to smarter soldiers that flank you that much harder, it may not be such a big loss anyway.
“Write whatever you want.”
That’s my editor, Medium’s own Stu Horvath, telling me what he’s looking for before the show has begun. Stu took ill and had to cancel his trip to the West Coast and, as a result, I’ve sniped nearly all of his appointments. This puts me in the enviable position of being a freelance writer with nowhere to go but where I want to go. In an assignment-driven business, this is almost a vacation.
In between appointments, I sit in the pressroom in West Hall and think about what E3 actually is this year versus what I thought its potential might be. This year is weird because E3 has become irrevocably weird. In the days before live-streamed announcements and smartphones and 24-hour news cycles marginalized the impact of all news announcements, E3 was the time when everyone pulled out all the stops to really try to surprise fans and competition alike.
When then-SCEA president Kaz Hirai unveiled the PlayStation 3’s insane price point in 2006 – “Five hundred and ninety-nine US dollars!” – it was certainly a shock. This year, Xbox One’s price tag was revealed at $499 – $100 more than the PS4 – and it almost felt like a side note, quickly forgotten in an hour-long barrage of videogame trailers and presentations.
Those economics aren’t off, at least not in terms of expectation (though the backlash and derision aimed at a $500 Xbox was palpable). The industry’s growth throughout this generation has not always been for the better. Money has twisted the top-heavy upper echelons toward loud incoherence, like a well-used San Fernando porn star going through familiar motions because they reliably pay out. Unlike past E3s, few titles among 2013’s Triple-A crop had much originality to them.
I did play a handful of fun current-gen games. The next 3D incarnation of Castlevania was, pleasantly, almost exactly how I pictured a sequel to the underrated Lords of Shadow; Dark Souls II was crueler and more dynamic than its predecessor, with impressively improved visuals; Suda 51’s Killer Is Dead delivered an engaging aesthetic experiment in violence and color – like a more graphic El Shaddai. In motion, it reminds me of cherry blossoms: the frenzied swordplay of your cyborg executioner avatar surrounded by a near-constant bouquet of curdled plumes of pink and purple and red that shift and unfurl so thick and fast with each blade swipe it’s hard to keep track of the action on screen.
A handful of non-playable presentations also proved entertaining, though what I enjoyed at the show failed to move me much. The problem was nothing stuck with me. I didn’t even make any real appearances at the nightly open-bar E3 party scene that publishers and hardware companies put on for show attendees. I grasped at straws countless times in front of my computer, Word’s cursor blankly blinking at me.
I ended up frequently walking around downtown LA. In the absence of parties I waited until the pressroom closed, usually grabbing dinner somewhere nearby with a handful of friends that seemed to change every night. We sat around tables cluttered with the forgettable culinary options of LA Live, the soulless commercial drag encompassing the Staples Center just up the Figueroa promenade from the convention center, seeking out opinions on the show. At least we could all agree E3 felt off. A momentary solace.
I don’t like Los Angeles. Like all big cities, it’s dirty (I’ve noticed downtown often smells like piss) and in every part I’ve visited, it seems dirtier than the last. The long city blocks seem to stretch the distance between you and the skyscrapers that sit unmoving in the polluted air, projecting the illusion that you’re so small that even when at a constant, brisk walking speed you never actually go anywhere. No matter how close you’re staying to the LACC – with the possible exception of the Figueroa Hotel just a few blocks north – you’re never close enough to not immediately want to jump into a cab.
Not that it’s a walking city anyway. The city’s 500-square-mile sprawl stretches to the proverbial horizon. If your whole existence there is, as mine has been, limited to the small southwestern corner of downtown encompassing Figueroa, its numbered streets branching east and the occasional trip west to Hollywood, your experience is probably not ideal. During the day the heat makes getting anywhere on foot feel oppressive. The combination is suffocating.
Los Angeles at night is its own beast. Figueroa is bizarre enough, since for all the storefront restaurants and shops, the only life seems contained within its multi-lane traffic patterns. You pass people on the street but no one seems to be heading towards a destination. Those already inside the shops and bars seem only to exist within them, entire lives spent as living, immobile puppets. Even here, you can’t outrun the long marketing arm of videogames, as giant ads plaster the sides of hotels and high rises, making sure you’re well aware of what first-person shooter you have to go check out in South Hall.
Several times people have remarked to me how alien the cityscape is at night. Alienation has driven the Los Angeles narrative in innumerable forms since the postwar era and it’s hard to picture the town being anything but as mottled and filthy as it is today. Downtown doesn’t seem to be populated by actual Angelenos – I’m positive no one actually hangs out there. For the week I’m there, I’m one of thousands of out-of-towners damned to walk streets that harbor a chill bordering on menace. The feeling of being an outsider never dissipates.
Every night on my zigzag trek east from Figueroa to the loft where I’m staying, I walk past the famed Los Angeles Central Library, a landmark I know from L.A. Noire, now dwarfed by the columns of buildings that tower around it on the streets adjacent. It’s convenient to romanticize the isolation of the city at night, as Michael Mann so expertly did in his dingy, grainy Collateral. The most romantic thing I see in my expeditions past the library is a greasy three-inch cockroach skittering soundlessly from the blackened sidewalk into a drainage ditch.
I bluffed my way in to see Deadly Premonition director Swery65’s new Xbox One title D4 on the last day, catching the second half of a 30-minute presentation split with an Xbox Live Arcade game so forgettable I didn’t bother writing its name down. D4 looks every bit as strange and wacky as Deadly Premonition’s Twin Peaks sensibilities would suggest: the plot involves a private detective who can go back in time, trying to solve his wife’s murder while evidently cognizant that his manipulation of the past can’t actually change the inevitable outcome of her death.
After the presentation, an outlandish 15 minutes of laugh-out-loud character interaction and weird Kinect interfacing, I asked Swery – impeccably dressed in a goofy Nipponese mod style that included designer frames, a yacht jacket, a fashionable summer scarf and leather shoes that probably cost around 400,000 yen – how he came up with D4’s madcap premise. His answer was short and to the point.
“I have no idea,” he said to me in strong English, not without a hint of irony. “My life.”
Out of all the Xbox One games Microsoft was scheduling appointments for at their booth, I’d guess less than one percent of attendees elected to check out D4.
Swery, like Suda and several other international developers, is in many ways what makes the industry bearable. The profound and subtle differences evident in games hailing from outside North America often make for more engaging premises, narratives and approaches to design. And while I can’t dispute that there’s some truth to the notion that some Japanese developers simply aren’t willing to take many financial risks, there seems to be more willingness to embrace auteurism in their work, a dichotomy I can only blame on a contrast in pop-cultural interaction.
The Evil Within, Shinji Mikami’s return to Resident Evil 4-esque survival horror, is a clear message from the man who popularized the genre (Alone in the Dark predated Resident Evil in concept by a number of years, but it failed to resonate much with a large audience). In the wake of high-profile disasters like RE6 and Dead Space 3, it seems clear that the majority of the industry doesn’t know how to deal with horror – action-based, psychological or otherwise.
The gameplay I see in Evil Within – a PI trying to escape a hellishly graphic asylum strewn with enough gore and monstrous imagery to fill a dozen grindhouse films – isn’t especially scary, but neither is playing Fatal Frame with multiple friends in a room, an activity I unsuccessfully attempted as a teenager. Mikami’s decision to break from the popular horror paradigm is no less piquant, especially measured against Wolfenstein: The New Order, the sequel whose presentation was paired with Evil Within’s.
Wolfenstein has been getting more buzz due to its breakout adventure game narrative moments (Evil Within has been deemed “too old-school”) though the Swedish-developed game feels much more Western than it could be. I have a hard time staying in the moment whenever the Aryan-countenanced Nazi killer B.J. Blazkowicz breaks the game’s supposed moody narrative intelligence by mouthing off incongruous one-liners.
Konami wasn’t showing any more of Metal Gear Solid V at their booth than Hideo Kojima’s grim, breathtaking director’s cut trailer. One night at the loft, I parsed through it to piece together the meaning of the many red herrings and meta-misdirection Metal Gear’s ingenious director had already pulled, not the least of which was disguising the game’s reveal as another title from an unknown European developer during Spike’s VGA broadcast last December. Kojima’s everlasting imprint on the industry, that of a creative mastermind of the medium that never fails to surprise and push forward videogames as a whole, made my analytical process more rewarding than many of my experiences on the show floor had been.
The Wild Hunt, the third installment in the Witcher series and a massive hunter-and-hart fantasy tale that looks to outsize Skyrim by a considerable number of leagues, is about as close to an interactive Game of Thrones as we’re likely to ever get, outclassing any new RPG developed in the United States or Canada. The Witcher began as its own series of novels by Polish writer Andrzej Sapkowski, probably a telling reason why the developers at CD Projekt Red have been able to adapt it into such a literary-feeling series: rich with politics, character interactions and a maturity reflecting deeper narrative concerns that make Elder Scrolls look skeletal. The Wild Hunt was running on an actual PS4 debug kit, rendering stunning, seamless landscapes with dynamic weather – not the most exciting use of all that tech, but through CD Projekt’s tour of the ordinary, in seeing such a realized, sophisticated and European world, I found something genuine.
Bayonetta 2 and Dragon’s Crown, a pair of wildly dissimilar Japanese brawlers, drew considerable crowds, presumably by not being stuffed into nondescript meeting rooms despite their niche appeal. Both games looked great: Bayonetta with its endless kinetic energy, Dragon’s Crown a playable HD exhibition for the painterly, Western-free brushstrokes of Vanillaware’s George Kamitani. If you have any institutional memory for modern Japanese game design, you don’t really need hands-on time with these to understand them.
Watching someone button-mash their way through the Bayonetta demo, I couldn’t help but wonder if half the people lined up to play had been drawn first by the allure of free swag or pictures with booth babes. In a head-to-head sales battle against high-profile Western releases, the Bayonettas and Dragon’s Crowns just about always trail in second place. As the demo player struggled with Bayonetta’s controls, I noticed the model Nintendo had hired to cosplay as Bayonetta posing nearby. I hated that I had been around long enough to notice she was far too diminutive to portray the character the way the girl promoting the original game at Sega’s booth across the hall had years earlier.
It’s around 8:30 in the morning and I’m on speakerphone. On the coffee table is a set of questions I’m working through with a high-profile European developer overseas – arguably one of the most ambitious designers working today. We have similar opinions on the potential of the medium versus the philosophical space they occupy on a commercial level. As the conversation winds on, he eventually gets to the heart of why he feels the industry needs to change.
“That same old gameplay no longer interests me,” he says.
I’m some hours into the tense, nightmarish post-apocalypse of The Last of Us and it’s the first game I’ve touched since finally returning home. I’ve always admired Naughty Dog for their narrative chops and technical wizardry. I’m looking forward to seeing what progresses in the desperate, brutal world Joel and Ellie inhabit – though their story is probably the last interactive pandemic narrative I’ll feel the need to digest for the foreseeable future. There are a handful of other games I’ll happily play in the coming months, for stylistic reasons or more substantive ones: Killer Is Dead, Beyond, maybe a couple PS4 launch titles and the meditative, moody indie sci-fi experiments Routine and The Swapper, if I can set aside the time to get them to run on my MacBook. These days I’m only interested in games that make me think and challenge me. Maybe it’s why lately I’ve been sticking to novels.
If there was ever a time to step up and evolve, with more Journeys and Papo & Yos and less fictions where every single scenario involves you picking up a gun and murdering your way through thousands of people, it’s now.
I don’t hate the game industry, even if its machinations, so apparent in the reflection of E3 and the town that’s long borne it, are something I find increasingly frustrating and problematic. Nor have I given up on videogames as a medium for creative expression. In fact, I’ve had more interviews with smart, fascinating, inspiring developers than I can count. It’s just that the traits the industry deems important seem silly. We’re not allowed to deviate from that candle-lit path. Hell, PR treats breaking embargoes – the binding agreements that prevent anyone from reporting on anything from those carefully drip-fed events until a certain date – as a form of treason. All this over pieces of entertainment software. As I wrap up my interview, it’s nice to talk with someone who shares some of my ideas. Maybe I just need a break from trade shows.
But, for the most part, I think I’m ready to put down the controller and let all the acts of violence in all the big, stupid Columbias in the digital world sort themselves out without me.