Games are dying a horrible Interactive Entertainment death
Gameplay in current-gen AAA action games is as stale as old bread. The introduction of RPG elements into every other genre has made games duller, shifting the focus from gameplay to story. People are playing them for the cutscenes, not for the gameplay itself, which usually feels repetitive, pointless and not even fun. How did it come to this?
As a species, we humans seem to have a fundamental need to entertain ourselves. Ever since we stopped living in caves, mankind has developed ways of immersing itself in games. The earliest known board games, such as Senet, were found in the ruins of ancient Egypt and date as far back as 3500 BC, and even older artifacts from Mesopotamia suggest games were being played as early as 5000 BC.
Video games could be considered the natural evolution of these primitive forms of pretend play, capable of transcending what is possible in the physical realm and leaving less to the imagination than a regular board or card game would have to.
Yet, for the past decade, AAA titles seem to be trying to negate these origins, striving to become more like movies and novels, and consequently less like games. Effectively, they’re becoming nothing but Interactive Entertainment.
A Brief History of Gameplay in Video Games
Back in the Atari 2600 days, blocky shapes counted on descriptions in the manual, the game's cover art and the player's imagination to send players on a safari expedition in Pitfall, an F1 race in Enduro, into space in Space Invaders, or into a dungeon in Adventure. An Atari game could only be great based on its gameplay.
During the 8-bit era, the blocky shapes made way for carefully crafted pixel art. Many characters we know and love were born then, such as Mario, Mega Man, Kirby and so many others. And yet gameplay was still king, and remained king as the SNES and the other 16-bit consoles took over. Deep, rich, involving stories were not commonplace; they were reserved for RPGs, which sacrificed gameplay to double down on story.
Then came the rise of the PlayStation era, the use of Full Motion Video, and the dramatic increase in storage space offered by optical (CD/DVD) media, coupled with advanced 3D graphics. It was then that video games gradually stopped being games.
But what is a game anyway?
According to French sociologist Roger Caillois (thanks, Wikipedia), what makes a game a game are the following properties:
- fun: the activity is chosen for its light-hearted character
- separate: it is circumscribed in time and place
- non-productive: participation does not accomplish anything useful
- governed by rules: the activity has rules that are different from everyday life
- fictitious: it is accompanied by the awareness of a different reality
- uncertain: the outcome of the activity is unforeseeable
The term game, according to 1980s game designer Chris Crawford, may be defined using a series of dichotomies:
- Creative expression is art if made for its own beauty, and entertainment if made for money.
- A piece of entertainment is a plaything if it is interactive. Movies and books are cited as examples of non-interactive entertainment.
- If no goals are associated with a plaything, it is a toy. If it has goals, a plaything is a challenge.
- If a challenge has no “active agent against whom you compete,” it is a puzzle; if there is one, it is a conflict.
- Finally, if the player can only outperform the opponent, but not attack them to interfere with their performance, the conflict is a competition. (Competitions include racing and figure skating.) However, if attacks are allowed, then the conflict qualifies as a game.
So what's wrong with current-gen AAA games?
We have come to expect 40+ hours of gameplay out of a full-price $60 game. And yet all we do during those 40 hours is go from point A to point B with minimal chance of failure, powering through AI that can barely be considered an active agent.
In games such as Assassin's Creed, Tomb Raider, The Last of Us, The Witcher, or any other action-adventure title, gameplay is the nuisance that happens between one cutscene and the next. There is no variation in outcome; you're just experiencing a story in a semi-passive way.
Contrast that with classic games such as Castlevania, Mega Man, Zelda and Mario. There were no cutscenes and no story, only a very short raison d'être and epic, challenging gameplay.
Your reward at the end of a level was having experienced it, beaten its challenge, and gone further in the game. It was not a CGI cutscene with some exposition accompanied by really bad voice acting (or maybe you actually believed that wizard came from the moon?).
Think of all the classic card and board games. Was there ever a story to chess? What's the plot twist in Solitaire? Has the king died at the end, killed by the Ace? What war is it that we fight every time we play Risk? Did our millionaire monopolist buy four mansions and a yacht, or has he donated it all to charity by the end of Monopoly?
A game is a means to have fun, and although it may also be a piece of art, being one is not its primary goal. While creative expression made with the intent to earn money is indeed entertainment, I'll go ahead and disagree with Mr. Crawford by saying that not every plaything with a goal qualifies as a challenge, much less a game.
The counter argument: coin-op days are over
Some may argue that classic video games merely emulated the arcade model, where the core objective was to make you spend more quarters.
Classic game titles were usually at most four hours long, and according to supporters of this theory, they were hard only to make sure you got enough replay value and game time for your dollar.
Current games do not need to abide by coin-op rules, as they're not seeing an arcade anytime soon. This enables games to be 40 hours long, which, according to proponents of this theory, is a valid reason for games not to be as challenging: they're long enough to be worth one's money.
I'd agree with that if it were any fun cruising through those 40 hours of cutscene-plus-interlude play we're offered. The fact is, more often than not, it's not. And it's not even challenging, so why bother at all?
Blame the community?
Whose fault is it? How did we end up in such a bad state? Is it us, gamers as a collective, who brought this upon ourselves? I don't think so.
Maybe game journalists are to blame. Every review containing the phrases "bad storytelling", "no depth", or "fails to make you care about the characters" is a push towards the abyss of games becoming “Interactive Entertainment” rather than joyful playthings.
The original Super Mario Bros. would probably score something like a 6/10 in a review these days. Despite that, it's probably the best-designed, most influential video game ever. It is also one of the most fun games to play out there, and the astonishing popularity of Mario Maker serves as a testament to that. But oh, it doesn't have a deep, moving story. Oh, it doesn't make you care about the characters. Oh, there is no lore to explore.
Take Ori and the Blind Forest, for example. During Romdo Sidequest 2 (soon to be published) we discussed how this sort of pressure led the designers to build up a huge expectation early in the game around its story, to the point it makes you feel it's a bad game for not delivering on it. And yet it's a great platforming game in and of itself, and while not exactly groundbreaking in any aspect, it's a great game as well as a masterful piece of art.
The same thinking and knee-jerk reaction to games caused tutorials to be deemed indispensable for the success of games in the early 2000s. And yet, these days, mandatory tutorials are acknowledged to be an insult to the player's intelligence and skill.
Games such as Minecraft, RUST and Bloodborne invite you to explore the game and search for what you don't understand, instead of boring you with hand-holding and rubbing the story in your face.
You’re crazy, people are buying Interactive Entertainment!
One might argue that if what I'm saying represented the gaming collective, it would be reflected in sales figures. And yet it's the Spunkgargleweewee of the year that always holds the top spot, followed by the usual suspects: the yearly Assassin's Creed, Far Cry, Halo and Witcher titles. It's either brainless, pointless multiplayer fragfests or Interactive Entertainment, with the odd sports title thrown in.
Well, here's a thought: what if all those sales are not reflected in the time people actually spend gaming? What if those games were bought by a fraction of the population that will either pop them into a console once a month for a quickie, or perhaps keep them shrink-wrapped with the intention of playing them whenever "they get time"?
What if loyal gamers, who actually give a shit about the games being made, were not well represented at all, wanting neither "deep story" games nor multiplayer shooters with no gameplay depth?
We'd be doing the games industry as a whole a big disservice by catering to the "general audience" instead of the "core audience". It is this core audience that grew up playing those classic games. It is members of the core audience that eventually risk it all to launch a compelling indie title. And it is members of the core audience that bring the general audience in by showing them how cool games can be.
Disclaimer and thanks
If you made it this far, thanks for reading the article. I'd really like to read your thoughts, both in favour of and against what I wrote! Check out romdo.io for the Romdo podcast, where we discuss all sorts of topics, or hit me up on Twitter at @alexmreis. See ya!