What Are Videogames?

Michael Thomsen
17 min read · May 29, 2017


When Leland Yee was sentenced to a five-year prison term in early 2016, some posters on the videogame forum NeoGAF were ready to celebrate. The former state senator from San Francisco had been convicted of felony racketeering after promising to help an undercover FBI agent buy guns from a militant group in the Philippines, an ironic turn for someone who’d spent much of his legislative energy on a bill trying to protect children from violent videogames. Yee sponsored a 2005 bill that passed the California Assembly, under which the state would run an independent ratings agency for videogames and prosecute retailers for selling violent ones to minors, part of a wave of anxiety about how games might be ruining the moral character of those who play them. “I don’t know about you but there’s a comfort in knowing that the so-called morally righteous who spend years fearmongering and persecuting me for my choice of pastime are far more disgusting people than I could ever imagine being,” one person wrote. Another: “…to see this fucking guy who put himself on a pedestal and shit all over our hobby, while he was a fucking criminal thief taking advantage of his position and power the whole time? Yeah, its nice.”

Yee had believed that videogames made young children more likely to behave violently, an anachronistic idea for which there has never been any substantive evidence. Perhaps that quality is what made it irresistible to politicians and lobbyists for years. The more uncertain the evidence, the more urgent the impulse for control. Hillary Clinton picked up Yee’s narrative in 2005 and, along with co-sponsors Joe Lieberman, Tim Johnson, and Evan Bayh, introduced the Family Entertainment Protection Act, a bill that would criminalize the sale of Mature-rated games to minors at the federal level. “We need to treat violent videogames the way we treat tobacco, alcohol, and pornography,” Clinton argued before the bill was forgotten. Videogames threatened children because they made violence seem too possible. In an interview with the Los Angeles Times in 2009, after the Supreme Court had announced it would hear arguments about his bill, Yee said he had been disturbed by “the fact that you can push a button and make certain horrific things happen. If you demonstrate to a child that you can do these things, it becomes part of their repertoire for dealing with anger…as you play these games over and over again, you become desensitized.” Apparently covert weapons sales were a less desensitizing way to deal with anger.

The shocking scenes of mass murder and dismemberment in Grand Theft Auto III, Mortal Kombat, and Postal 2 that prompted Yee’s bill seem like puppet theater a decade later, but the suspicion that videogames are corrosive has never quite gone away. In 1983, David Sudnow documented the addictive properties of early arcade games like Breakout, Defender, and Missile Command in his memoir, Pilgrim in the Microworld. Tetris got its own clinical syndrome, and DOOM was the Wormtongue whispering in the ear of Eric Harris and Dylan Klebold as they planned to attack their classmates in Columbine. Counter-Strike trained Seung-Hui Cho for his mass murder at Virginia Tech, and dozens of hours spent with Call of Duty charged Adam Lanza’s mind in advance of Sandy Hook. Anders Breivik said he’d practiced aiming through a holographic sight in Call of Duty before killing 69 people at a summer camp in Norway, yet he insisted the year-long sabbatical he spent playing World of Warcraft in his mother’s apartment, as much as 16 hours a day, was “simply a hobby.” By 2012, when National Rifle Association executive vice president Wayne LaPierre answered calls for further limiting sales of assault weapons in the days following the Sandy Hook Elementary School shooting by blaming videogames — “a callous, corrupt, and corrupting shadow industry that sells and sows violence against its own people” — the argument seemed to have exhausted itself. There was nothing left but the husk of hyperbole, as hollow as a fallen tree trunk whose insides have slowly rotted away.

Another kind of hyperbole flourished in the rot of those violence anxieties, a utopian expectation that videogames will help humankind better appreciate the systemic nature of life and, over time, usher in an age of empathy and reason. Designer and NYU professor Eric Zimmerman described this transitory epoch as “The Ludic Century.” According to Zimmerman: “The problems the world faces today require the kinds of thinking that gaming literacy engenders. How does the price of gas in California affect the politics of the Middle East affect the Amazon ecosystem? These problems force us to understand how the parts of a system fit together to create a complex whole with emergent effects. They require playful, innovative, trans-disciplinary thinking in which systems can be analyzed, redesigned, and transformed into something new.”

Educators have seized on this faith that games can reenergize a student’s interest in school, spawning a massive industry of game-based learning companies. A movement for using games to contribute to international development has formed around the Games for Change conference in New York, promoting the work of designers who claim to be addressing issues like global poverty and human rights by developing low-cost PC and mobile games. And a new generation of artists and writers has pursued the idea of empathetic games, forcing players to experience life from the point of view of a person from a marginalized socio-political demographic, with games like That Dragon, Cancer, Cart Life, Gone Home, Depression Quest, and This War of Mine. Elsewhere, ISIS has used a modification of the military shooter Arma 3 as a recruitment tool, mirroring the U.S. government’s long-running series of America’s Army games, one version of which is used as a training simulation for new soldiers and another of which is sold as commercial entertainment. News channels have mistaken game footage for warfare: an Iranian news program showed a clip from Medal of Honor claiming it was a Hezbollah sniper fighting against ISIS troops, and an Irish news station accidentally used footage from Arma 2 thinking it showed an IRA unit using weapons bought from Libya to shoot down a helicopter.

Whether or not videogames have defined the 21st century so far, they have certainly been a Zelig figure in it, winding their way through the background and attaching themselves to any new cause or conflict. Yet there has never been any certainty about what videogames are. MIT’s primitive conversation program ELIZA was not a game, but Yoot Saito’s Dreamcast classic, Seaman, built around communicating with an irritable half-human pet, was. The hypnotic repetitions of the Xbox 360’s Geometry Wars counted, but David O’Reilly’s Mountain, about watching daylight, moonlight, and various forms of weather occur to a three-dimensional mountain, didn’t. The pseudo-8-bit throwback Shovel Knight was real, but Feng Mengbo’s Long March: Restart was something else. Videogames always evoke suspicion about how easily the new can displace the real. A consumer good that combines Bronze Age mining, Industrial Age assembly lines, and 21st-century programming, the videogame is a dreamlike convolution that feels lucid and urgent in the moment and inexplicable after. Videogames are the pastime of an amnesiac society continually reinventing its own creation myths. They are the ether we apply to the present in order to make it seem operable, unfinished and in need.

Before computers ingested play, it was a supplement to other systems of human interaction, which placed implicit limits on how and when play could take place. Johan Huizinga’s oft-cited Homo Ludens: A Study of the Play Element in Culture describes play as something “older than culture,” taking place in the “sacred spot” one discovers between work and leisure. Huizinga also acknowledged the kinship between violence and play. “The two ideas often seem to blend absolutely in the archaic mind,” he wrote. “Indeed, all fighting that is bound by rules bears the formal characteristics of play by that very limitation. We can call it the most intense, the most energetic form of play and at the same time the most palpable and primitive.”

In Man, Play, and Games, the French philosopher Roger Caillois suggested that in games without rules or conflict the fictional conceit itself was a burden players were conscripted into maintaining, something that “replaces and performs the same function as do rules.” In playing with a doll, for instance, there is an implicit limit to the roleplay that involves accepting the doll’s limits (it is not conscious, but neither is it a rock; its costume and bipedal design implicitly suggest some basic terms for player behavior). “Rules themselves create fictions,” Caillois says, and by responding to the imposition of obviously artificial limits, the player must invent a reason for sequestering herself or himself from reality, defined as whatever is ungovernable by the game’s rules.

Players are not just responsible for following the game’s rules but also for governing the border that separates the play space from an exterior reality. Moving play onto computers frees players from this double labor. Through a computer, play does not take place in a territory so much as a separate dimension, over which the player admits they have no power. GIFs of glitches in FIFA and Assassin’s Creed become popular jokes not just because of their unexpectedness but because they reveal how little control the player has over the rules that govern the computer’s reality.

Before computers were machines they were people, workers charged with calculating everything from navigation routes and munitions shipments to crop yields and general store accounts. Charles Babbage’s first calculating machine in 1822 didn’t catalyze a mass-market industry for computers, and his machines were never conclusively accepted as superior to human laborers, but he made the idea of removing human life from an aspect of productive society more plausible than it had ever been. He made it thinkable that a human computer could be replaced by a machine computer for a fraction of the price, and with improvements to speed and reliability.

In 1946, the United States Army produced the first major breakthrough in generalized computing with the Electronic Numerical Integrator and Computer, a 30-ton behemoth nicknamed the “giant brain.” ENIAC was the result of years of government-sponsored research into computational machinery to calculate artillery firing tables, and it would also be used to perform a feasibility study on the hydrogen bomb. After ENIAC’s public unveiling, the military sponsored the Moore School Lectures for a select group of scientists, mathematicians, and researchers from Great Britain and the United States, many of whom would go on to build an influential array of early computers, including the Whirlwind I at MIT, which served as the foundation for the Air Force’s SAGE system for air defense and contributed to the buildup of the computer science department whose restless grad students would, a decade later, build Spacewar!, the first widely distributed videogame.

Videogames made the obscure fruits of computer research physical, adrenal — the anemic penciling of the lonesome intellect transformed into a storm of sensual wonderment, palpating our shared metaphysics in a super-reality freed from physical limits. Beneath Yee’s accusations that computer games encouraged young people to develop violent tendencies was an anxiety over their capacity for pleasure, their power to reengineer the rote as romantic. Games don’t just simulate a series of actions; they make those actions feel a certain way. Games turn violence into work, and then make that work feel like hedonic excess, something that casually inverts the logic of capital economies, whose moral foundation depends on work being indistinguishable from punishment, an atonement for having enjoyed something in the world outside work.

When a teenage Chris Roberts thought about what he would do with his life, he found himself choosing between his two kinds of entertainment. He had grown up loving movies but had disliked the idea of having to work with big groups of people. “I think I gravitated to computers because you could do it yourself versus making a film,” he told G4TV in 2011. “You would have to have a friend, there would be acting, you’d have to get a camera, and lights…[it] involved a lot more logistics, whereas back in the old days you [could do] everything on a computer. You could draw the graphics, you could do the programming, you could do the writing, you could do the sound, so I think that’s kind of how I went down that path.”

He’d begun programming his own crude games as a teenager, and then used them as samples to get a job at Origin Systems, home of Ultima creator Richard Garriott, where he set about making the kind of games that he’d always wanted to play as a kid. Soon enough the business of keeping himself entertained took on the contours of a career. In 1990, after a few years at Origin, he produced Wing Commander. It was his first real hit, a game that sent players dogfighting through deep space, with interludes between missions filled with digital tableaus composed of close-up shots of other characters, who casually trickle out new information about the story and upcoming missions. It was considered a breakthrough, but one in which the game’s form seemed to overshadow anything it held within. “The power of film is an actor’s face,” the Hungarian director István Szabó said of the close-up shot. Wing Commander’s close-ups looked like tile mosaics as much as they did human figures, but they still had the hypnotic stopping power of a film close-up. Without human actors, or a script that could make up for their absence, these interludes seemed marooned in themselves, the formal equivalent of the intimate eye contact of a few minutes spent in the dentist’s chair, the distance between one face and another hinging on a paper-thin mask to keep the breath of one from filling the mouth of the other, all while simple yes-no questions and technical commands keep things moving.

The game led to a series of add-ons and sequels, which were successful enough to make Roberts wonder if he wasn’t missing something by having chosen games over movies. In 1996 he left Origin and founded his own company, intending to develop both movies and computer games with movies in them. The spinoff studio struggled to finish its ambitious games projects and its lone film release, 1999’s adaptation of Wing Commander starring Freddie Prinze Jr. and Saffron Burrows, earned back less than a third of its budget. Microsoft bought the studio in 2000 and Roberts left shortly after to pursue moviemaking full-time. He served as producer on a series of mostly forgettable movies including Lord of War and Lucky Number Slevin. Then, in 2011, he got an idea for another game.

Star Citizen wasn’t supposed to be the most expensive crowdfunded game in history. Roberts had been inspired by the early success of Minecraft and the way the audience had accepted playing a game in an unfinished state. Looking back on his old Wing Commander games, Roberts thought there might still be a better way to get that fantasy of spaceflight to emerge from computer code now that processors were more powerful and the audience seemed to embrace the newly blurred distinction between play testing and play, between buying a finished product and making a donation to a company.

“I’m not making this game because I want to make a pretty penny,” Chris Roberts told the games website Kotaku in 2015. “I’m making this because it’s my dream space game that I’ve always wanted to play, and it feels like right now — with the combination of technology and all the people we’ve had back it so far — I can do it.” Star Citizen was first shown to the press in private hotel room demos in 2012, at a spinoff of the Game Developers Conference, GDC Online. Roberts assumed that with the help of some enthusiastic previews from the games press he’d be able to draw a big enough pool of fan donors to his website to finish a prototype, which he could continue developing in the footsteps of Minecraft.

Unlike Minecraft’s creator, however, Roberts had no intention of working alone on a game that appeared to be made from papier-mâché cubes. Instead, he designed an elaborate outsourcing mechanism as ambitious and open-ended as his game concept, a company called Cloud Imperium Games. He first contracted two digital outsourcing companies — Behaviour Interactive, with offices in Montreal and Santiago, Chile, and CGBot, in Austin — to help build 3D models and environments and provide programming support. Roberts also licensed CryEngine, a game development platform created by a German technology company with offices in Sofia, Budapest, Istanbul, Kiev, Seoul, and Shanghai. After building his simple prototype and creating a short trailer to announce the game, Roberts pointed fans to a custom website where they could donate money to fund the game’s development without his having to make a case to game publishers. He raised $4.1 million on his own website and another $2.1 million on Kickstarter, taken from roughly 1.5 million fan donations.

Four years later he had raised more than $130 million to pour into his own private creative vortex. Ongoing daily donations have averaged between $20,000 and $50,000. Though the company has raised more than 20 times the sum it had originally requested, the number of people donating to the cause has remained relatively constant, with the game’s website listing 1,650,392 “Star Citizens” who helped keep development going, just 100,000 more than had come out to support the game during its announcement in 2012. Cloud Imperium opened a number of new offices around the world to scale up its development plans. Roberts moved from Austin to Santa Monica to open the company’s second office, and later offices would open in Manchester, England, and Frankfurt, Germany. Roberts also subcontracted two additional studios: the Denver-based Illfonic, to develop a first-person shooter module for the on-foot sections of the game that would supplement the space dogfights, and the Edinburgh-based Moon Collider, to build artificial intelligence for all of the game’s enemies and systems.

In 2014, Roberts released the game’s first module, a hangar area connected to a marketplace for ship models that players could buy from Cloud Imperium or trade with one another. There was little to do in this module but walk around a model of the ship that would theoretically be flyable one day, and Cloud Imperium released a series of studious advertisements for each ship mirroring the slow-motion gloss of luxury car commercials, imbuing each with a fantasy of status and lifestyle fulfillment. There was no way to actually fly these ships or test the properties each commercial promised, but the fantasy was enough. Players mirrored the game’s fundraising logic by turning this inert showroom into a marketplace, agreeing to sell rare or desirable ships in exchange for cash transfers over PayPal.

Later that same year, Cloud Imperium launched a dogfighting module, in which a limited number of ships were available for basic space combat. Another module let people fight through waves of enemy ships, and a third let players land on a planet and explore a limited area on foot. None of the modules fit together into a single game, and there were no overarching systems that might make progress in one area meaningful in another, but the game’s unfinished state worked as a place of surplus imagination, an open cathedral of disassembled parts among which people could wander and imagine they were bonded by a common vision without ever having to say what exactly that was.

When the PlayStation 4 was released in November of 2013, thousands of the consoles experienced the “blue light of death,” a hardware error in which no video signal would reach the television. According to Sony, 0.4% of the one million consoles sold on the first day of release experienced this glitch. That August, a post on a Chinese message board, later discovered by someone on an American games forum, had claimed to be from a student at Xi’an Technological University North Institute interning at the Foxconn plant where the consoles were being manufactured. “Since Foxconn are not treating us well,” the anonymous poster claimed, “we will not treat the PS4 console well. The PS4 console we assemble can be turned on at best.”

The suggestion was of organized sabotage to protest working conditions at Foxconn. The poster alleged that the students were being forced to work without pay as part of an internship program, a claim later refuted by a Foxconn representative, who said the students were given a monthly salary of 1,600 yuan (about $246), the same wage as any other worker, and by another intern in the program, who pointed instead toward the 20,000 migrant workers from the southern province of Guizhou and the coastal city of Yantai who had been working in the factory during the alleged incident.

While it remains uncertain whether the PS4’s glitch was the result of sabotage, it’s clear that many at the plant had reason to be unhappy. A 2012 study of labor conditions at Foxconn by Hong Kong Polytechnic University’s Pun Ngai and the University of London’s Jenny Chan found sabotage was a common form of resistance to unfair impositions by shift leaders and factory managers. There were also coordinated efforts to slow the pace of assembly to protest production quotas, and full work stoppages timed to special rush orders that required increased output. And though salaries have been raised incrementally, management has offset each increase with demands for more output. One worker in a Shenzhen factory that assembled cell phone cases claimed that after a pay increase, daily output went up by 25%, from 5,120 pieces per day to 6,400. “We’re completely exhausted,” he told Ngai and Chan.

Variations on these kinds of labor pressures are reflected throughout the games industry. Members of Cloud Imperium’s art team alleged they were being made to work between 60 and 80 hours a week, and the UK team resigned in protest in 2015, fearing that this schedule, which they’d accepted as a kind of temporary crunch to finish press demos or trial modules for early supporters, had simply become normal for a game that had no discernible end point. Roberts disputed the claim that anyone was being made to work such long hours, but with development distributed across five major organizations with offices all over the world, there’s no way to ensure workers aren’t being taken advantage of. You just have to believe it’s true and keep going.

If it’s hard to define videogames, it’s even harder to say what the videogame industry is. From the tantalum mines in the Democratic Republic of Congo and Ethiopia whose product helps regulate electrical currents in game consoles, to the processing plants in Kazakhstan where the raw material is turned into something that can be affixed to a motherboard, to the assembly plants in China, to the shipping companies that move such low-margin components across the globe, to the contract workers who mass-produce digital art at outsourcing companies and the QA testers who work on a seasonal basis for low hourly wages, to the underpaid retail workers at Wal-Mart, Best Buy, and GameStop who help sell it all — videogames distill such a wide array of exploitation and indifference to other lives that it’s hard to imagine them as anything other than a polygonal marine spraying bulletfire toward distant enemies spawning on the horizon, the detached comedy of a Sim being led through a slapstick pantomime of a life, or the hungry Pac-Man-sized mouth happily trying to consume everything on the screen.

“We can’t allow for individualism,” Roberts told Kotaku in 2016. “On a smaller team, maybe you can have that brilliant person that doesn’t play ball with everyone else because they just do their thing and that’s all they do. On a team of our scale there is no aspect of this project where you will be working just by yourself, you have to be able to play well with others… what you want is people who are positive, can do, and want to work as a team, not always trying to go ‘This won’t work’ or ‘It’s about me’.”

Pastimes turned into industries are cursed to always have to outrun their own triviality. Pleasure is derived from waste; eventually any industry that claims to produce pleasure must account for the things it wants to waste — lives, landscapes, imaginations. Speaking to the games website USGamer in 2014, Roberts described his biggest fear about Star Citizen, anxious that the grand illusion might begin to bore the same people who’d helped fill it with power. “The risk of failure,” he said, “is the public support ends before I can get it to the point where everyone thinks it’s great.” The longer Star Citizen remains unfinished, the greater its chances of becoming the perfect game. It will have to be for how much has been put into it, and for how much room there is for still more.
