Microtransactions: A Plague on Modern Gaming, or a Necessary Evil?
Microtransactions. Just the word causes many members of the gaming community to break out in hives. Within gaming, a microtransaction is the exchange of real-world money for in-game content. These microtransactions can take many forms, from purchasing skins in Epic Games’ Fortnite to buying loot crates (called “reserve crates” in-game) in Call of Duty: Black Ops 4. Their pricing typically runs from $0.99 to $99, though in certain situations it can climb higher. The moral ramifications of microtransactions are not the purpose of this article, as that is a separate discussion in itself. Instead, this article seeks to answer one question nagging at the gaming community: are microtransactions necessary, or are they just a way for companies to wring money out of their most loyal fans? As with everything else in the gaming sphere, it’s complicated.
To the surprise of no one, the video game industry makes a ton of money. In total, it made $43.4 billion in 2018 (an 18% increase from $36.9 billion in 2017) and is expected to continue growing in 2019. Gaming is bigger than Hollywood, with releases like Grand Theft Auto V earning $817 million on its opening day. That doesn’t take into account the $6 billion it has made since its initial release. That means one game has grossed more money than any book, film, or album ever released, making it the most successful entertainment product in human history. The launch and subsequent success of Grand Theft Auto V illustrate why microtransactions are as commonplace as they are. Put simply, they’re ludicrously lucrative.
Grand Theft Auto V has made $500 million from microtransactions alone, and those sales are nearly 100% profit for Rockstar. That’s half a billion dollars put directly into Rockstar’s pocket, simply for releasing new content for a six-year-old game and putting a price tag on it. Making games is almost incomprehensibly expensive, and it takes money to keep those games fresh and updated. So isn’t implementing microtransactions vital to a game’s survival? According to Michael Pachter, a digital media analyst and Head of Research for the Private Shares Group, as far as AAA content is concerned, no.
It’s true that the price of video games has stayed the same for over a decade while production, development, and advertising costs have risen substantially in that time, but initial sales and DLC releases tend to recoup those costs if the game is big enough. Grand Theft Auto V made back its $250 million budget (the most expensive video game ever developed at the time) on preorders alone, ending its first day nearly $600 million over its development cost. Call of Duty: Black Ops 4 has an estimated development cost of $200–250 million and made $500 million in its opening three-day weekend; immediate sales alone brought in roughly 200–250% of what the game cost to make. And that isn’t counting the dubious addition of microtransactions months after the game’s release (including, but not limited to, a $30 hammer). Microtransactions within AAA titles are thus perceived as nothing more than exploitation: publishers grabbing as much cash as possible from their most devoted players.
Contrast that with free-to-play models, which currently dominate the mobile market and rely on microtransactions to keep their games running and their servers updated. Take Digital Extremes’ Warframe as an example. Despite being released in 2013, Warframe has 41,509 active players on Steam as I write this. It is an incredibly popular free-to-play game with droves of people joining its servers daily. Anyone who has played Warframe knows it is littered with microtransactions and separate in-game currencies, which let you purchase new blueprints, warframes, and weapons, or skip the often arduous crafting timers. The secret to Warframe’s success isn’t the microtransactions themselves but the way the developer presents them. Digital Extremes clearly cares a great deal about Warframe, and all in-game content can be unlocked just by playing (even if that takes upwards of 1,000 hours). Because the microtransactions are optional, players tend to accept their existence more easily and are thus more likely to spend money on in-game content.
What about smaller indie developers that don’t have the same resources as behemoth publishers like EA and Rockstar? What does the microtransaction environment look like for those smaller companies? Simply put, microtransactions just aren’t as prevalent there. A lot of indie developers sell their games for significantly less than the $60 standard, but their games also cost significantly less to make. This means fewer copies need to be sold to make back the cost, and while that doesn’t drown indie developers in money, they can turn a decent profit while building a loyal fan base through gameplay. There’s a very interesting article here that lays out how many people are involved in AAA development and the astounding amount of manpower needed to create our favorite games compared to indie teams.
There’s an argument to be made for letting free-to-play models have microtransactions, especially cosmetic ones, but players generally disapprove of AAA developers locking content behind paywalls. One thing is certain: microtransactions are unfathomably lucrative, so they aren’t going anywhere anytime soon. However, gamers tend to speak with their wallets, and if game companies don’t find a better way to present these paywalls, that money may find a home elsewhere, in less greedy pockets.
Hello everyone! I know I’m a little late to the microtransaction party, but I still find the entire system fascinating. If you have any critiques or questions, feel free to email me at firstname.lastname@example.org. Thanks for reading and have a great day!