Been ramping up my Python skills with some real-world projects using Neural Nets. I learned to create a Neural Net in Python a few years ago from Tariq Rashid’s excellent book “Make Your Own Neural Network.” His code for teaching a Net to recognize handwritten digits really did work, but I didn’t know what to do with it after that. The usual projects have to do with categorizing irises and predicting house prices, and those didn’t interest me.

I like graphical stuff and teaching my coding students to make cool art. They’re mostly little kids who love to make games, or rather, to play them. Sometimes you can get them to code a game they want to play, like Flappy Bird. It’s a very simple game: the goal is to make the bird fly through the gap in the pipes. Gravity pulls the bird down, and the player makes the bird flap by pressing a button:

This is NOT my program. It’s just to show you how the real game looks!

Dan Shiffman posted a bunch of videos on his fun, inspiring YouTube channel Coding Train about making a Flappy Bird AI using neuroevolution. That means you make a Neural Net to be the bird’s brain and train it to play the game using a Genetic Algorithm. You set a hundred or more birds loose in the game, and the birds who perform better go on to the next generation, some with their brains mutated a little or crossed with other birds’ brains to simulate reproduction. The point of Shiffman’s videos was to copy the functionality of the game, not the exact look, so I made a working clone in Processing where I could make a circle bob up and down using the space bar, with rectangles that moved to the left and collided with the circle.

Artificial Intelligence (or at least skill) needed

Even more dire than my gaming skills was the fact that I couldn’t use my favorite Neural Network in the game, because Processing is written in Java, not Python. The “Python” mode runs your code on the Java Virtual Machine (via Jython), so you can’t import libraries like numpy that rely on C extensions.

I spent a few days coding my own file to do matrix operations and adapted the neural network code to use it, but the result was a disappointment. The birds flew, surprisingly, but they just maintained their altitude and never avoided the pipes.
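That matrix file isn’t shown here, but the heart of any such file is just multiplying rows by columns. Here’s a minimal sketch of a no-numpy matrix multiply in plain Python (a stand-in for illustration, not the actual file from the project):

```python
# A stand-in sketch (not the project's actual file): multiply two matrices
# represented as lists of lists, with no numpy required.
def matmul(A, B):
    rows_A, cols_A = len(A), len(A[0])
    rows_B, cols_B = len(B), len(B[0])
    if cols_A != rows_B:
        raise ValueError("Inner dimensions must match")
    result = [[0.0] * cols_B for _ in range(rows_A)]
    for i in range(rows_A):
        for j in range(cols_B):
            for k in range(cols_A):
                result[i][j] += A[i][k] * B[k][j]
    return result

# Example: a 2x3 weight matrix times a 3x1 column of inputs
W = [[0.1, 0.2, 0.3],
     [0.4, 0.5, 0.6]]
x = [[1.0], [2.0], [3.0]]
print(matmul(W, x))   # roughly [[1.4], [3.2]]
```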

I followed Shiffman’s example of taking 5 inputs: the height of the bird from the ground, the velocity of the bird, the vertical heights of the upper and lower parts of the nearest upcoming pipe, and the all-important distance to that pipe. These values went into a list (or array) that became the inputs to the neural net. The five inputs can be seen on the left of the diagram below:

The nodes and weights of the Neural Network in the birds’ brains
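In code, gathering those five values each frame might look something like the sketch below; the variable names and the scaling toward the 0–1 range are my own guesses, not the project’s actual code:

```python
# A sketch with made-up names and scaling; the real code's variables may differ.
def get_inputs(bird_y, bird_vel, pipe_top, pipe_bottom, pipe_x, bird_x, height=500.0):
    """Pack the five observations the net sees each frame into one list."""
    return [
        bird_y / height,              # vertical position of the bird
        bird_vel / 10.0,              # vertical velocity, roughly scaled
        pipe_top / height,            # bottom edge of the upper pipe
        pipe_bottom / height,         # top edge of the lower pipe
        (pipe_x - bird_x) / height,   # distance to the nearest upcoming pipe
    ]

print(get_inputs(250.0, 3.0, 180.0, 320.0, 400.0, 80.0))
# -> [0.5, 0.3, 0.36, 0.64, 0.64]
```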

The lines from the input nodes on the left to the 6 hidden nodes in the middle of the network are weights. The number in an input node is multiplied by the number in the weight, and the result goes into the hidden node. The final value for a hidden node is the sum of all the numbers going into it. But there are 5 weights going into every hidden node, and there are 6 hidden nodes, so that’s 30 multiplications plus all the adding! Good thing there are matrices. They make doing this a breeze for a computer.

Here’s how the “signal” is propagated through the net. First, the 5 input values are multiplied by the weights and summed together in the hidden nodes:

Matrix multiplication in action, sending the inputs to the hidden nodes.

The weights are all random decimals. We’ll see how good a bird brain they make (spoiler alert: not very good) and “mutate” them in the next generation to see if we can improve the bird’s performance. From the 6 hidden nodes, there are 6 weights leading to the output node. More matrix multiplication and we have our final number. If it’s over 0.5, the bird will flap. If not, it won’t flap.
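Putting the whole forward pass together in code might look like this sketch. I’m assuming a sigmoid squashing function (as in Rashid’s book) and using random stand-in weights; the real project’s details may differ:

```python
import math
import random

def sigmoid(x):
    # Squash any number into the 0-1 range.
    return 1.0 / (1.0 + math.exp(-x))

def feed_forward(inputs, weights_ih, weights_ho):
    """One forward pass: 5 inputs -> 6 hidden nodes -> 1 output."""
    # Each hidden node sums its 5 weighted inputs, then gets squashed to 0-1.
    hidden = []
    for h in range(6):
        total = sum(weights_ih[h][i] * inputs[i] for i in range(5))
        hidden.append(sigmoid(total))
    # The single output node sums the 6 weighted hidden values.
    return sigmoid(sum(weights_ho[h] * hidden[h] for h in range(6)))

# Untrained birds start with random decimal weights.
weights_ih = [[random.uniform(-1, 1) for _ in range(5)] for _ in range(6)]
weights_ho = [random.uniform(-1, 1) for _ in range(6)]

inputs = [0.5, 0.3, 0.36, 0.64, 0.64]   # made-up example values
output = feed_forward(inputs, weights_ih, weights_ho)
if output > 0.5:
    print("flap!")
else:
    print("no flap")
```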

In the diagram above, the output is well over 0.5, so the bird will flap. This process is repeated by every bird, every frame(!), until only the best performers are left. Here’s how the first few generations go:

Training flocks of birds on the course

Each generation starts off with 500 birds, and most don’t make it past the first pipe. The ones that do get saved, and their brains are copied into a bunch of offspring, many of which get mutated a little to see if that produces better results. Just like real life!
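Here’s a hedged sketch of how a next generation could be built from the survivors. The mutation rate, the Gaussian nudge, and the function names are my guesses for illustration, not the project’s exact code:

```python
import random

MUTATION_RATE = 0.1   # my guess at a rate; the real project may differ

def mutate(weights):
    """Copy a brain's weight list, nudging some values by a small random amount."""
    new_weights = []
    for w in weights:
        if random.random() < MUTATION_RATE:
            w = w + random.gauss(0, 0.1)   # small random nudge
        new_weights.append(w)
    return new_weights

def next_generation(best_brains, population=500):
    """Fill the next generation with mutated copies of the surviving brains."""
    return [mutate(random.choice(best_brains)) for _ in range(population)]

# Example: two surviving weight lists seed a new flock of 500 birds
survivors = [[0.2, -0.5, 0.9], [-0.1, 0.4, 0.7]]
new_flock = next_generation(survivors)
print(len(new_flock), "birds in the next generation")
```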

My friend and collaborator Paddy Gaunt suggested that doing the training without the graphics would be quicker, and after seeing a bunch of generations crash and fail, I was OK with that. Now all I would get is printed output showing how many generations had been tested and the highest score achieved so far. Here’s an example of the printout:

The saved weights from the best bird’s brain

I saved the array representing the weights from the bird’s brain that produced such an amazing performance. There’s also a way to have numpy do that automatically (sketched a little further down), to save you having to copy and paste the array back into the program. Once you’ve copied the brain into a bird and set it loose in the graphics again, it works wonderfully! I wanted to showcase the champion birds in a more professional-looking environment, so I found some more authentic graphics:

Since Flappy is so well trained now, I wanted the game to look a bit more like the original!
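As for the numpy shortcut mentioned above for saving the champion’s weights: numpy’s save and load functions handle it, assuming the headless training runs in regular Python where numpy is available (the file names here are made up):

```python
import numpy as np

# Stand-ins for the trained weight arrays (the real ones come from training).
best_weights_ih = np.random.uniform(-1, 1, (6, 5))   # input -> hidden
best_weights_ho = np.random.uniform(-1, 1, (1, 6))   # hidden -> output

# Save them once training is done...
np.save("best_bird_ih.npy", best_weights_ih)
np.save("best_bird_ho.npy", best_weights_ho)

# ...and load them later instead of copy-pasting the printed arrays.
weights_ih = np.load("best_bird_ih.npy")
weights_ho = np.load("best_bird_ho.npy")
print(weights_ih.shape, weights_ho.shape)   # (6, 5) (1, 6)
```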

The birds were trained on an easier course, and the pipes appear at random heights, so these all-stars aren’t simply remembering the course; they’re trained to flap according to the info they get from the environment!

Imagine using this same process to train real things like drones or robots! I fully plan to train more nets on games like Asteroids (my favorite video game back in the days when they cost a quarter), Snake, and newer games like 2048. Stay tuned!

My code is available on GitHub: https://github.com/hackingmath/Neural-Net

--

Peter Farrell

Author of Hacking Math Class with Python and Math Adventures with Python. Math is Art. Wants to introduce Math teachers and learners to Programming.