About a year ago, I heard about this amazing project called Donkey Car, where a camera is used in conjunction with a Raspberry Pi-controlled car chassis to train an AI to drive the car autonomously. The project combines many of my favorite things, including RC, Raspberry Pi, cameras, and hardware hacking, plus a few things I would like to learn, such as machine learning. However, believe it or not, I didn't have enough SD cards to flash all the images for my different projects, so I had to wait. I recently bought eight 64 GB microSD cards and have thus decided to start the project.
In the original configuration, the Donkey Car uses a Raspberry Pi to collect the video, steering servo, and throttle data, which is then copied onto a second computer to train the neural net. This is quite possibly the most straightforward way to proceed. Recently, Nvidia released the Jetson Nano dev board, which has the potential (if not the practicality) of performing the training on the board itself. Doing everything on the car would be much more convenient, though it is likely to be too slow. Nevertheless, I bought myself a Nano and will test it out.
I also want to deviate from the original design in one way. In the original build, the car is controlled via joysticks, keyboards, gamepads, etc., through the Pi. Although this should work reasonably well, it is unlikely to be as intuitive or responsive as driving with the RC remote. So, instead of controlling the car through the Pi, I will run the Pi in 'slave' mode, listening in on the throttle and servo signals as they pass from the receiver into the vehicle. This will likely require some changes to both code and hardware, and I will be detailing these in the coming posts.
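To give a flavor of what 'listening in' involves: an RC receiver outputs standard servo PWM on each channel, where the pulse width (conventionally about 1000–2000 µs, centered at 1500 µs) encodes the stick position. The Pi would time those pulses on its GPIO pins (for example with the pigpio library's edge callbacks) and convert them to the normalized values the Donkey Car software expects. Here is a minimal sketch of that conversion; the function name and the exact microsecond range are my assumptions based on the common RC convention, not measurements from my receiver.

```python
# Hypothetical sketch: map an RC receiver's servo pulse width (in
# microseconds) to the normalized [-1.0, 1.0] steering/throttle range.
# The pulse widths themselves would come from timing the receiver's
# signal lines on the Pi; the 1000/1500/2000 us endpoints are the
# usual RC convention and may need calibration per transmitter.

def pulse_to_norm(pulse_us, min_us=1000, mid_us=1500, max_us=2000):
    """Convert a servo pulse width in microseconds to [-1.0, 1.0]."""
    # Clamp to the expected range to guard against glitched readings.
    pulse_us = max(min_us, min(max_us, pulse_us))
    if pulse_us >= mid_us:
        return (pulse_us - mid_us) / (max_us - mid_us)
    return (pulse_us - mid_us) / (mid_us - min_us)

# A centered stick (1500 us) maps to 0.0, full deflection one way
# (2000 us) to 1.0, and full deflection the other way (1000 us) to -1.0.
```

The clamping step matters in practice: noisy edges or a briefly disconnected signal wire can produce out-of-range pulse timings, and it is safer to saturate them than to pass oversized commands to the training data.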
I am pretty excited about this project, and hope that it will be a success. Cheers.