# Fhourstone: A Complex Connect-4! Solving Benchmark

This article provides the reader with an overview of Connect-4!, the Fhourstone benchmark, and how the benchmark exercises CPU performance. The author does not claim any ownership of the products/devices used in this article.

# The Game

You are probably familiar with the board game shown below from your childhood. It goes by many names, such as Four in a Row, Four in a Line, and Drop Four, but it is most commonly known as **Connect-4!**. Connect-4! is a two-player game in which the players take turns dropping colored discs from the top into a vertically suspended 7-by-6 grid. A player wins by forming a horizontal, vertical, or diagonal line of four of their own colored discs [1].

# The Benchmark

**Fhourstone** is a non-synthetic integer benchmark inspired by the problem of solving the complex game of Connect-4!. The benchmark measures a computing device's capability **to solve the entire game by searching and evaluating as many game positions as possible per unit of time**, using an Alpha-Beta enhanced minimax search algorithm with the history heuristic. The game likely makes an appealing benchmark because there are 4,531,985,219,092 possible game positions that can arise during play [2].

The Fhourstone benchmark was created by **John Tromp** and developed between 1996 and 2008. Implementations are available in the Java, C, and Haskell programming languages. You can try them out from his GitHub page here.

# The Setup

Since Fhourstone measures how fast your CPU can search game positions per unit of time (*Kpos/sec*), you can use game positions of increasing complexity as your test cases. The idea is that the fewer slots already occupied on the 7x6 board, the larger the remaining search space and the more complex the problem becomes. The images below show several current game positions of increasing complexity that you can try with the benchmark:

You can try out or create your own Connect-4! game position and have it solved using the simulation from Game Solver found here. Just input your current game position pattern in the URL after the '=' sign, or as instructed below.

On the original webpage of the benchmark, you are given sample game positions to try out. We will use these provided test cases as the official ones in the next sections. Thanks, John!

# Testing Fhourstone

The following tables detail the system and processor specifications. Three computers with varying model/brand and processor specifications were selected as testing devices.

# Analyzing Fhourstone Results

The output of the benchmark, when you run it from your command line or IDE, looks similar to the sample below. The first line indicates the game position, and the succeeding lines indicate the number of turns left until the game is solved. On the second-to-last line, you'll see the familiar *Kpos/sec* figure, which is your CPU's overall score for the benchmark. According to the original webpage, **1 Fhourstone is equal to 1,000 game positions searched per second**.

The table below details the results of running Fhourstone on three different computers. The results show a similar increasing pattern in the averaged thousands of positions searched per second (*Kpos/sec*) across all devices as the complexity of the test cases increases, save for a small outlier in PC2 on Test Case 4. The test cases are arranged in order of increasing complexity, so an initial observation can be made: as the complexity of a test case increases, so does the number of possible positions that can be searched, because fewer slots are currently occupied by the players. In Test Case 4, **there are no discs on the board and it is practically empty**; the Alpha-Beta searcher with the history heuristic implemented in Fhourstone therefore has the most 'space' on the 7x6 board, yielding the highest positions-searched-per-second figure in the whole experimental setup.

# The Algorithm Behind It

The engine that fuels the Fhourstone benchmark is the **Alpha-Beta search algorithm with the history heuristic**. You can read the full research paper on the algorithm by **Jonathan Schaeffer** at this link [3]. Basically, Alpha-Beta search, or Alpha-Beta pruning, is a search algorithm commonly used to evaluate two-player games such as Chess, Go, and, of course, Connect-4!. The algorithm stops evaluating a move when it determines that the move cannot influence the final choice [4][5].

When Alpha-Beta search is applied to a minimax tree (a mathematical representation of the current game position and the moves available from it), it removes or disregards game moves, in the form of nodes and the edges connecting them to succeeding nodes, that cannot affect the outcome of the search. Fhourstone repeatedly applies Alpha-Beta search to its game tree (don't bother looking for this; it's just an internal representation) until one of the players finishes the game or there are no more spaces left on the board.


# References:

[1] Connect Four. https://en.wikipedia.org/wiki/Connect_Four.

[2] Number of legal 7 × 6 Connect-Four positions after n plies. The On-Line Encyclopedia of Integer Sequences.

[3] Schaeffer, J. (1989). The history heuristic and alpha-beta search enhancements in practice. IEEE Transactions on Pattern Analysis & Machine Intelligence, 11(11), 1203–1212.

[4] Russell, Stuart J.; Norvig, Peter (2010). *Artificial Intelligence: A Modern Approach* (3rd ed.). Upper Saddle River, New Jersey: Pearson Education, Inc. p. 167. ISBN 978-0-13-604259-4.

[5] Alpha-Beta pruning. https://en.wikipedia.org/wiki/Alpha%E2%80%93beta_pruning