Please no hacks…
Why would anyone care about cheating in a video game? Twenty years ago this would only have been a point of contention for those attempting to play competitively, whether for casual fun or in the limited competitive scene. In 2020, games and interactive media generated $139.9 billion in revenue, a significant increase from 2000, when industry revenue was less than $50 billion. Growing interest in PC gaming, consoles, and mobile platforms has driven how lucrative the industry has become. Additionally, over the past 11 years esports has become a significant media draw and an advertising mechanism for companies' products. This has increased the need to create a positive, secure environment for customers to enjoy the product. Cheating adversely affects that environment, but the full scope of that topic is a discussion for another day.
As most people are now aware, many tech companies leverage their applications to collect user data, either for their own product development or to give other companies a way to reach new customers and develop their own products further. An interesting example: in 2021 ByteDance, TikTok's parent company, began offering a service that lets companies leverage its AI recommendation algorithm to increase the visibility of their products and services. This is not necessarily a new concept (every website that serves ads uses metadata to some degree), but software as a service is becoming more pervasive. Games are no different; they are software that provides entertainment as a service. While independent game developers tend to approach development from a more traditional, artistic standpoint, AAA developers leverage player data for current and future products.
Where cheating has become a major problem in today's industry is in skewing the consumer's experience and undermining competitive integrity. For the vast majority of people enjoying online games, the experience is provided by the company's servers via matchmaking. In years past there was a larger focus on player- or developer-run servers that were administered more carefully, but this is now mostly automated due to the scale of these operations. People exploiting these systems with third-party programs can go undetected unless the company has implemented its own anti-cheat software or purchased a package to integrate into its game (Fortnite, for example, uses Easy Anti-Cheat).
This is where data collection and ethical issues arise. Most anti-cheat software needs to run with low-level system access in order to detect nefarious applications running on the computer. This means the user is giving up privacy and also opening an avenue of attack should the anti-cheat software itself be compromised. Consumers are left asking themselves whether they really wish to trade privacy for the hope of improved product integrity. Hackers constantly work to bypass anti-cheat in games, so it can be frustrating for a user to give up privacy and still not receive the benefits. Companies offering anti-cheat services also have to maintain the security of their software and, depending on the countries they operate in, comply with data collection laws.
Recently it was shown that a game's visual output (from a personal computer or a game console) can be fed to a separate computer running image recognition software, trained with machine learning to identify in-game adversaries, which then feeds input back into the original machine to defeat those targets. The problem this creates for the aforementioned anti-cheat software is that, because the cheat engine runs on a completely separate device, the methods used by traditional anti-cheat are bypassed entirely. Such a system could also be trained in similar fashion for many different games: the training routine involves feeding game footage in, training the model on the gameplay mechanics to master, and then generating the outputs for the desired hardware. Instead of programming an engine for each game, all the hacker (or the company selling the hack) has to do is run image recognition and tailor the outbound hardware inputs. For some genres, such as strategy games, this might prove very difficult (although machine learning has been shown to be very good at playing Dota: https://www.theverge.com/2019/4/13/18309459/openai-five-dota-2-finals-ai-bot-competition-og-e-sports-the-international-champion), but for first-person shooters it has already been shown to create a significant competitive advantage. With large amounts of prize money on the line in esports, and influencer status at stake for content creators, this causes even more transparency problems.
Potentially the best way to fight AI is with AI. Machine learning has been used in the past to detect cheaters in first-person shooters, but it was not found to be a definitively accurate measure. False positives are very undesirable here: punish an innocent consumer and they may never use your product again, or you may remove a valuable competitor or content creator from your platform. In the past this might not have mattered as much, but with how powerful social media has become at signal-boosting products, it is not in a company's best interest to push these individuals away from its platform. Since game companies producing high-end commercial products already rely heavily on telemetry, it makes sense to build better back-end detection to find people cheating, whether through software on their own computer or through hardware inputs. Because the in-game data (excluding the data the company collects from the consumer for account purposes or other means) could be tracked and processed for suspicious activity, this approach would cost the user less privacy. Games like Valorant already do this to a degree: they not only run kernel-level anti-cheat software, but also claim to use machine learning to detect cheating within a match. If a person is detected cheating, the match is canceled and the offender is punished. Assuming no false positives, this is a very good system, because the other players are not forced to finish a compromised match; if the offender were merely removed mid-game, one team would be down a player (Valorant is a 5v5 tactical shooter).
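As a rough illustration of what this kind of server-side statistical detection might look like, the sketch below flags players whose average per-match headshot rate is an extreme outlier relative to everyone else. Everything here is hypothetical (the function name, the single-signal data, and the thresholds); a real detector would combine many telemetry signals. A median/MAD baseline is used instead of mean and standard deviation so the cheaters themselves cannot inflate the baseline and hide inside it.

```python
import statistics


def flag_suspicious_players(match_rates, z_threshold=4.0):
    """Flag players whose average headshot rate is an extreme outlier.

    `match_rates` maps a player id to that player's per-match headshot
    rates. This is a hypothetical single-signal sketch, not a production
    detector. Median/MAD is robust: a few cheaters in the data barely
    move the baseline they are measured against.
    """
    all_rates = [r for rates in match_rates.values() for r in rates]
    median = statistics.median(all_rates)
    # Median absolute deviation, scaled so it is comparable to a stdev.
    mad = statistics.median(abs(r - median) for r in all_rates)
    scale = 1.4826 * mad or 1e-9  # guard against zero spread
    flagged = []
    for player, rates in match_rates.items():
        robust_z = (statistics.mean(rates) - median) / scale
        if robust_z > z_threshold:
            flagged.append(player)
    return flagged
```

With four players averaging around a 20% headshot rate and one averaging 95%, only the outlier crosses the threshold; punishing the whole match (as in the Valorant example above) would then only happen when the evidence is this extreme.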
The other reason companies need to utilize telemetry and data collection on player actions to punish exploiters is that some games have what are essentially entire companies using bots to acquire in-game assets worth real-world money. In massively multiplayer games, people have long collected in-game currency and sold it to other players as a job, because it could prove more lucrative than other options in developing countries. With botting (creating a script or other software that repeatedly runs in-game commands to acquire items or currency), this problem has only gotten worse, inflating in-game economies or allowing in-game markets to be manipulated. These activities do not directly hinder a player's experience, but they alter the dynamics of the game and are harmful to the lifespan of a product. If a player realizes there is a shortcut to obtaining coveted in-game items, the reward of playing the game as intended is diminished. It is also harmful to the developer or publisher, because it means a third party is using their platform for monetary gain. A warehouse was found in Ukraine this year with thousands of PS4s running FIFA to obtain in-game items on accounts that would then be sold to players. Because the in-game items in question can cost exorbitant amounts of money, this illicit method of obtaining them was costing the publisher large sums. It is highly likely that if EA (the publisher in question) had any level of data collection evaluating trends in match inputs, or even just match frequency, it would have spotted a problem with these accounts. It seems highly unlikely that an account playing 24/7/365, potentially against the same opponents over and over, is being operated by a human.
Additionally, once such an account was sold, the sharp regression of its playtime toward a normal human average would make it fairly clear that the account had been botted and resold.
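The playtime heuristics described above can be sketched in a few lines. This is a minimal illustration under assumed data and thresholds (the function names, the 20-hour floor, and the drop ratio are all hypothetical, not any publisher's actual rules): one check flags accounts grinding nearly around the clock with almost no day-to-day variation, and another looks for the sudden, sustained drop in daily playtime that would suggest the account was handed off to a human buyer.

```python
def looks_like_bot(daily_hours, min_hours=20.0, max_spread=1.0):
    """Heuristic check on a list of hours-played-per-day.

    Hypothetical thresholds: humans rarely sustain ~24h/day with almost
    no variation, while an item-farming bot does exactly that.
    """
    mean = sum(daily_hours) / len(daily_hours)
    spread = max(daily_hours) - min(daily_hours)
    return mean >= min_hours and spread <= max_spread


def likely_handoff_day(daily_hours, drop_ratio=0.5):
    """Return the first day index where playtime falls below
    `drop_ratio` of the running average so far, suggesting the botted
    account was sold to a human player; None if no such day exists.
    """
    for i in range(1, len(daily_hours)):
        running_avg = sum(daily_hours[:i]) / i
        if daily_hours[i] < drop_ratio * running_avg:
            return i
    return None
```

For example, an account logging 23.5 hours every day for two weeks trips the first check, and a series that drops from that level to 2–4 hours a day trips the second at the day of the drop. Real systems would of course weigh many more signals before taking action, for the same false-positive reasons discussed earlier.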
There are many incentives for game companies to dig deeper into their game data: not only to secure their products, but to gain a deeper understanding of why players enjoy certain games or features. If data science is going to drive the core development of AAA games, then studios need to invest more in understanding why certain gameplay elements work well and how to create new ones that will put them ahead of the competition. Finding more ethically sound ways to maintain a level playing field will only give customers more reason to keep coming back to a platform. In my next post I want to talk further about how player behavior data can be used to create a more positive gameplay experience. See you later!
SuperData (Nielsen), 2020 Year in Review: Digital Games and Interactive Media