How Much Should Working Conditions Affect How We See Video Games?

If a game looks great, plays great, and is a joy to experience, but also put its developers through hell, how should we react?

Thomas Jenkins
The Coastline is Quiet
3 min read · Oct 15, 2018


Red Dead Redemption 2 comes out a week from Friday, and it might be the most anticipated game of 2018. We could quibble over whether that honor might belong to Spider-Man, God of War, or Super Smash Bros. Ultimate, but Rockstar’s latest entry has built a buzz of anticipation that seemingly grows louder with every passing hour. The previews are resoundingly positive, and every new detail that comes out makes the game seem even more interesting.

In a recent article from Vulture, Red Dead’s creators revealed that they have been working 100-hour weeks to complete the game, from writing to editing to any number of fixes and tweaks. In all honesty, this isn’t much of a surprise, given the game’s ambitious scope and the studio’s attention to detail. But it does raise a question: should we as consumers take this into account when we form our opinions about this or any video game?

The practice of working as hard and as long as possible to finish a video game is common. Many developers call it “crunch,” and it has been a driving force behind the success of games like The Witcher 3, Uncharted 4, and any number of other big-budget AAA titles. As commonplace as it may be, though, there are serious ethical problems with asking employees to work tens of hours of unpaid overtime every week and to sacrifice time with their families and friends.

Jason Schreier, for Kotaku, has written on this subject in great detail. The best examples are in his book, Blood, Sweat, and Pixels, where he uncovers the development histories behind some of the biggest and most successful games of the last few years. He has also vocally called for video game developers to unionize, a call that many other journalists have taken up as well. For anyone interested in learning more, I highly recommend his book and his work at Kotaku.

But for consumers, the only meaningful choice is whether or not to buy the game. One could argue that consumers should stop buying any game that seems to have been developed under “crunch” conditions, but that would immediately rule out some of the biggest and best games in recent memory. Besides, it shouldn’t fall to consumers to fix problems caused by studios with unrealistic working expectations of their employees. So while I applaud people who boycott games made this way, I don’t expect it of anyone.

If the question is whether we should take working conditions into consideration when forming opinions about the biggest and best video games, the answer is yes. But to the more difficult question, to what degree, I have no answer. I still plan on buying the game, after reading a few reviews, so maybe these last few hundred words amount to a bunch of nothing. Or perhaps the fact that I’m writing a small blog post that only a few people will read is a sign that more people are starting to think this way.

But I do think this is clear: crunch is a problem for video games, and current practices are unsustainable. There are better jobs out there for developers, engineers, writers, and actors. Eventually, if nothing else, the talent drain will almost surely force some kind of change.
