A Short History of Multiplayer Networking

Gone are the days of dragging your computer and peripherals to your friend's house just to challenge them to the game of your choice. Nowadays, gamers can boot up their PCs and connect with other players around the world without realising the immense amount of design and engineering that has gone into creating what we now take for granted: fast-paced multiplayer action at our fingertips.

But how did we get to this stage? Multiplayer gaming certainly didn’t just appear out of thin air — in fact, quite far from it.

Ain’t no party like a LAN club party. // Source: NerdyBunnee

Networking for multiplayer games, as with any networking, has to be reliable and fast. Originally, multiplayer games played across multiple machines handled networking in a fairly primitive way, using peer-to-peer methods. In a peer-to-peer network, every connected player shares responsibility for the game's state. Whilst this may seem like a good idea, it can lead to issues that are detrimental to gameplay, such as one client losing sync with the other players. Some peer-to-peer games use the lockstep protocol to guarantee that every connected player has submitted their input before the outcome of the next game tick is determined. This causes a major problem: every client's game can only run as fast as the client with the slowest connection, defeating one of the main aims of networked play. However, it was the only solution multiplayer games had at the time, and it was therefore better than nothing. A minimal sketch of the idea follows.
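To make the lockstep bottleneck concrete, here is a toy Python sketch of the protocol's structure. Every name here (Peer, receive_input, simulate) is a hypothetical stand-in for illustration, not taken from any real engine; the point is the blocking wait that gates each tick.

```python
# Toy lockstep sketch; all names are illustrative assumptions.

class Peer:
    """Stands in for a remote player; receive_input would really
    block on a network socket until that player's input arrives."""
    def __init__(self, peer_id):
        self.peer_id = peer_id
        self.state = 0

    def receive_input(self, tick):
        return f"input:{self.peer_id}:{tick}"   # pretend network read

    def simulate(self, tick, inputs):
        # Deterministic: identical inputs produce identical state
        # on every machine, which is what keeps peers in sync.
        self.state = hash((self.state, tuple(sorted(inputs.items()))))

def run_lockstep(peers, num_ticks):
    for tick in range(num_ticks):
        inputs = {}
        for p in peers:
            # Blocks until p's input for this tick arrives; one
            # laggy peer stalls every machine at this line.
            inputs[p.peer_id] = p.receive_input(tick)
        for p in peers:
            p.simulate(tick, inputs)

run_lockstep([Peer(0), Peer(1)], num_ticks=3)
```

The gathering loop is the whole story: no peer may advance tick N until it holds inputs from all peers for tick N, so the session runs at the pace of the worst connection.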

Luckily, that wasn't good enough for John Carmack and the rest of id Software. In 1996, id Software released Quake, a fast-paced multiplayer first-person shooter that featured a completely new approach to networking: the client-server model. This method changed the way multiplayer games were handled by introducing another computer that acted as the ruler of the game itself. All players wanting to play would connect to that computer, the server. The players' computers, known as clients, would send each keypress from the player to the game server. The server accepts all input data, calculates how it affects the game state, and relays this information to the relevant connected clients. Using a server as the central hub is great in many ways: only one authoritative instance of the game state exists (rather than a copy on each player's computer), so clients cannot fall out of sync with each other. It also allows the game to continue without waiting for the slowest player's connection, letting players with stable connections enjoy the game without delay. Sounds great, right? Well, it's not perfect just yet… A rough sketch of such a server loop appears below.
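The following is a rough Python sketch of what an authoritative server loop in this model might look like. It is a toy built on assumed names (Client, GameState, TICK_RATE and so on), not id Software's actual code; a real server would deal with sockets, serialisation, and far more game logic.

```python
import time

# Toy authoritative server loop in the client-server model.
# All names and values here are illustrative assumptions.

TICK_RATE = 20  # simulation ticks per second (assumed value)

class Client:
    def __init__(self, player_id):
        self.player_id = player_id
        self.pending = []          # keypresses received from the player

    def poll_input(self):
        inputs, self.pending = self.pending, []
        return inputs

    def send(self, snapshot):
        pass  # in reality: serialise and write to the network socket

class GameState:
    def __init__(self):
        self.positions = {}        # player_id -> x coordinate

    def apply_input(self, player_id, key):
        dx = {"left": -1, "right": +1}.get(key, 0)
        self.positions[player_id] = self.positions.get(player_id, 0) + dx

    def step(self, dt):
        pass  # physics, projectiles, etc. would advance here

def server_loop(state, clients, ticks):
    for _ in range(ticks):
        # 1. Drain whatever input has arrived; never wait on a slow client.
        for c in clients:
            for key in c.poll_input():
                state.apply_input(c.player_id, key)
        # 2. Advance the single authoritative copy of the game.
        state.step(1.0 / TICK_RATE)
        # 3. Tell every client what the world now looks like.
        for c in clients:
            c.send(state.positions)
        time.sleep(1.0 / TICK_RATE)

clients = [Client(player_id=i) for i in range(2)]
clients[0].pending.append("right")   # pretend a keypress arrived
server_loop(GameState(), clients, ticks=1)
```

Notice that, unlike lockstep, the loop never blocks on any one client: input that hasn't arrived yet simply gets applied on a later tick.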

“Did I leave the oven on?” // Game: Counter-Strike: Global Offensive // Source: pcgamesn

The problem with the basic client-server model is that each client's game only reacts once it has received information back from the server. A player would press a key and then have to wait for the input to travel to the server and the response to travel back, a full round trip, making it almost impossible to react to the game in time.

This all changed with the introduction of client-side prediction in Duke Nukem 3D and QuakeWorld in 1996. Client-side prediction meant that a client's computer could locally predict how its input would affect the game state. The client would still send its input to the server, but it would also immediately anticipate the server's response and display the new game state straight away. This meant that players on slower connections would no longer notice the round trip to the server, resulting in smooth gameplay. A toy sketch of the technique follows.
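Below is a toy Python sketch of the idea, again using hypothetical names rather than any engine's real API. It also includes the reconciliation step that prediction is usually paired with: when the server's authoritative snapshot arrives, the client rewinds to it and replays any inputs the server hasn't yet acknowledged.

```python
# Toy client-side prediction with server reconciliation.
# All names are illustrative assumptions.

class PredictingClient:
    def __init__(self):
        self.position = 0          # locally predicted position
        self.unacked = []          # inputs sent but not yet confirmed
        self.next_seq = 0

    def on_keypress(self, dx):
        seq = self.next_seq
        self.next_seq += 1
        self.unacked.append((seq, dx))
        # send_to_server(seq, dx) would go here
        self.position += dx        # predict instantly, no waiting

    def on_server_snapshot(self, acked_seq, server_position):
        # Reconciliation: adopt the authoritative state, then replay
        # every input the server hasn't processed yet on top of it.
        self.unacked = [(s, dx) for (s, dx) in self.unacked if s > acked_seq]
        self.position = server_position
        for _, dx in self.unacked:
            self.position += dx

client = PredictingClient()
client.on_keypress(+1)            # character moves on screen immediately
client.on_keypress(+1)
client.on_server_snapshot(0, 1)   # server confirmed input 0 at position 1
assert client.position == 2       # input 1 was replayed on top
```

If the server's result ever disagrees with the prediction, the replay in on_server_snapshot quietly corrects the client, which is why mispredictions show up as the occasional small "snap" rather than constant input lag.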

This method is still used by most networked multiplayer games today, which demonstrates how well it scales and how broadly it applies, even though it dates from 1996. Although there have been refinements and additions to the method's workings (e.g. lag compensation, entity interpolation), the majority of the process remains the same. And without the likes of Quake, QuakeWorld and Duke Nukem 3D pioneering the client-server model and client-side prediction, we may never have known the importance of well-engineered networking in multiplayer games.