How I Built a Self-Driving Model Car

Part 2 — The Software and Learning the First Model

Alistair Mclean
Geek Culture
5 min read · Jun 30, 2021


Previous posts in this series can be found here.

Installing the Software

I like to use ssh to log onto the Jetson and then run byobu (with the tmux-resurrect plugin installed) so I can keep multiple windows open within the PuTTY console and restore my session from where I left off before I last shut the car down.
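Assuming a stock Ubuntu image on the Jetson, that terminal setup amounts to (the plugin path is the conventional one and is an assumption):

```shell
# byobu wraps tmux and is in the standard Ubuntu repos
sudo apt install byobu
# start byobu automatically on every login
byobu-enable
# tmux-resurrect is a tmux plugin; cloning it into ~/.tmux/plugins and
# loading it from ~/.tmux.conf is one common way to install it
git clone https://github.com/tmux-plugins/tmux-resurrect ~/.tmux/plugins/tmux-resurrect
```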

VirtualEnv

On the Jetson, first install virtualenv and create a DonkeyCar environment. Activate this and do everything in this environment. I use virtualenvwrapper from here.
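With virtualenvwrapper installed (and its script sourced in your shell profile), creating and entering the environment looks like this — the environment name is just my choice:

```shell
# create a python3 virtualenv called "donkeycar"
mkvirtualenv donkeycar -p python3
# activate it; everything that follows runs inside this environment
workon donkeycar
```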

DonkeyCar

The first and main piece of software to install is DonkeyCar. On the Nano there is a small config change depending on your version of JetPack: the DonkeyCar distribution comes with a script to install for JetPack 4.4, but I had 4.5. Go to the install/nano folder, create a new install-jp45.sh and change the relevant line :
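The exact line varies between DonkeyCar releases, so treat this as a sketch: the JetPack-specific part of the install script is the NVIDIA TensorFlow wheel URL, which encodes the JetPack version. The file name and URL fragment below are assumptions:

```shell
# Sketch only -- copy the JetPack 4.4 install script and point the
# TensorFlow wheel at the JetPack 4.5 (v45) redist directory instead of v44
cp install.sh install-jp45.sh
sed -i 's|/jp/v44/|/jp/v45/|' install-jp45.sh
```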

You can then run pip install -e .[nano]

Calibrating

DonkeyCar comes with a calibrate utility that allows you to determine the max, min and centre (zero) positions for your car. First of all it's worth checking that the PCA9685 bus is configured correctly.
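The standard way to check is i2cdetect from the i2c-tools package (the bus number may differ on your carrier board):

```shell
# scan I2C bus 1 for attached devices
sudo i2cdetect -y -r 1
```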

The scan shows that the PCA9685 can be addressed at 0x40 and 0x70. The first is the default and should already be configured. Look for the following in myconfig.py, which says we're using bus number 1 at address 0x40.
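These are the relevant settings in the standard DonkeyCar myconfig.py template (the parameter names are from that template; the values match the defaults described above):

```python
# myconfig.py -- PCA9685 servo driver settings
PCA9685_I2C_ADDR = 0x40    # default address reported by i2cdetect
PCA9685_I2C_BUSNUM = 1     # I2C bus number on the Jetson
```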

You then run the calibration tool, entering values manually into the console while observing the steering and throttle (make sure the car is off the ground!). By iterating and gradually increasing/decreasing the values you find where the thresholds are. These go into myconfig.py in mycar,
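The tool is run once per channel, and with a non-default bus it needs the bus flag too. The channel numbers here assume the usual wiring of steering on channel 0 and throttle on channel 1:

```shell
# calibrate steering (channel 0) on I2C bus 1, then throttle (channel 1)
donkey calibrate --channel 0 --bus=1
donkey calibrate --channel 1 --bus=1
```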

which for example may result in
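The parameter names are from the standard DonkeyCar myconfig.py; the numbers below are illustrative only — yours come out of the calibration runs:

```python
# myconfig.py -- illustrative PWM values from calibration
STEERING_CHANNEL = 0         # PCA9685 channel driving the steering servo
STEERING_LEFT_PWM = 460      # full-left pulse
STEERING_RIGHT_PWM = 290     # full-right pulse

THROTTLE_CHANNEL = 1         # PCA9685 channel driving the ESC
THROTTLE_FORWARD_PWM = 500   # full-forward pulse
THROTTLE_STOPPED_PWM = 370   # neutral / zero throttle
THROTTLE_REVERSE_PWM = 220   # full-reverse pulse
```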

Joystick

I had an existing Steam Controller and wanted to use it to get me going. Unfortunately this isn't one of the standard configs in DonkeyCar, so I had to create my own using the custom joystick utility that comes with DonkeyCar. However, the Steam Controller requires the PyGameController and some other bits and pieces to work.

Then get the steamcontroller code from here. I cloned it and ran the setup and then added the steamcontroller rules in /etc/udev/rules.d. You’ll need the user that runs it to be in the games group. Now we can run :
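With the steamcontroller package installed and the udev rules in place, the controller is exposed as a virtual gamepad. The script name below is the one shipped by the commonly used Python steamcontroller project and is an assumption — check the scripts your clone installed:

```shell
# emulate an Xbox 360 pad from the Steam Controller; creates /dev/input/js0
sc-xbox.py start
```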

and you’ll see js0 appear in /dev/input. This is the default location for DonkeyCar to look for a controller.

Now run the joystick wizard :
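The wizard ships with DonkeyCar:

```shell
# interactive wizard; writes out a my_joystick.py with your button mappings
donkey createjs
```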

This will write out a python file but the class is configured to extend JoystickController. We need to edit it so that it extends PyGameController :
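A sketch of the edit — the import path is an assumption, so check where your DonkeyCar version keeps its controller classes:

```python
# my_joystick.py -- change the base class of the generated controller
from donkeycar.parts.controller import PyGameController  # module path is an assumption

class MyJoystickController(PyGameController):
    # ... the button/axis mappings stay exactly as the wizard wrote them ...
    pass
```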

Next we need to get DonkeyCar to use the CustomController, which means editing myconfig so that the CustomController is created :
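The original snippet wasn't preserved here; as a very rough sketch, the wiring is a branch that builds the wizard-generated controller when the custom type is configured (the surrounding names and config keys are assumptions based on the standard DonkeyCar template):

```python
# sketch -- create the custom controller when configured
if cfg.CONTROLLER_TYPE == 'custom':
    from my_joystick import MyJoystickController   # file written by donkey createjs
    ctr = MyJoystickController(
        throttle_scale=cfg.JOYSTICK_MAX_THROTTLE,
        steering_scale=cfg.JOYSTICK_STEERING_SCALE)
```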

And in myconfig.py set the controller type to custom :
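These are the two settings involved (names from the standard myconfig.py template):

```python
# myconfig.py -- use the wizard-generated controller
USE_JOYSTICK_AS_DEFAULT = True   # start driving with the joystick
CONTROLLER_TYPE = 'custom'       # load my_joystick.py written by donkey createjs
```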

I had problems with this controller though: it was difficult to control the car precisely, the controls sometimes lagged or became unresponsive, and the Bluetooth range wasn't great. I also tried one of the 'standard' controllers, the Logitech F710, but the range was even worse (a few metres at best).

These controllers were ok for the first indoor prototype although I had to follow the car around so I wasn’t out of range, but they weren’t going to be good enough for the next version.

OpenCV

I wanted to do some image preprocessing so I installed OpenCV, which means building it from source. This requires the additional swap space from the first article, and I then used this. Note this builds a GPU-enabled version of OpenCV, although you have to change your code to use CUDA. It also works on the CPU without changes, so this is what I started with.
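For reference, the flags that matter for a CUDA build on the Nano look roughly like this — a sketch, not the full cmake invocation, and the opencv_contrib path is an assumption (5.3 is the Nano's CUDA compute capability):

```shell
# inside the opencv build directory
cmake -D CMAKE_BUILD_TYPE=RELEASE \
      -D WITH_CUDA=ON \
      -D CUDA_ARCH_BIN=5.3 \
      -D OPENCV_EXTRA_MODULES_PATH=../../opencv_contrib/modules \
      ..
make -j4 && sudo make install
```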

Learning the First Model

After driving the car around inside and collecting around 8000 images I was ready to learn a model. I had access to an Nvidia 1080Ti so I decided to learn the models on that machine and then copy the model back to the car.

I started with 480x640 images and I wanted to use the Nvidia architecture from here. Of course this meant adding a custom neural architecture to the DonkeyCar keras.py file and updating utils.py to select that architecture when the passed-in model type was nvidia.

In keras.py :
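The block that went in here wasn't preserved, so this is a sketch of the Nvidia PilotNet layer stack in tf.keras. The function name is my own, the input shape is the 66x200x3 from the Nvidia paper (your resized camera frames would differ), and the two-value output matches DonkeyCar's steering-plus-throttle convention:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

def create_nvidia_model(input_shape=(66, 200, 3)):
    # Layer sizes follow the Nvidia "End to End Learning for Self-Driving Cars" paper
    img_in = layers.Input(shape=input_shape, name='img_in')
    x = layers.Lambda(lambda t: t / 255.0)(img_in)            # normalise pixels to [0, 1]
    x = layers.Conv2D(24, (5, 5), strides=(2, 2), activation='relu')(x)
    x = layers.Conv2D(36, (5, 5), strides=(2, 2), activation='relu')(x)
    x = layers.Conv2D(48, (5, 5), strides=(2, 2), activation='relu')(x)
    x = layers.Conv2D(64, (3, 3), activation='relu')(x)
    x = layers.Conv2D(64, (3, 3), activation='relu')(x)
    x = layers.Flatten()(x)
    x = layers.Dense(100, activation='relu')(x)
    x = layers.Dense(50, activation='relu')(x)
    x = layers.Dense(10, activation='relu')(x)
    outputs = layers.Dense(2, name='angle_throttle')(x)       # steering + throttle
    return Model(inputs=img_in, outputs=outputs)

model = create_nvidia_model()
```

In DonkeyCar this model would sit inside a pilot class alongside the existing ones in keras.py, but the wrapper boilerplate is omitted here.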

In utils.py :
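And the matching dispatch, roughly: DonkeyCar's get_model_by_type() in utils.py picks the pilot class from the --type string, so the edit adds one branch (KerasNvidia is a hypothetical wrapper name for the architecture above):

```python
# donkeycar/utils.py (sketch) -- inside get_model_by_type()
elif model_type == 'nvidia':
    kl = KerasNvidia(input_shape=input_shape)   # hypothetical wrapper class
```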

And the train command :
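With DonkeyCar 4.x the training entry point looks like this — the paths are mine, and the --type value is whatever string the dispatch above matches on:

```shell
# run on the 1080Ti machine, inside the donkeycar environment
python train.py --tub ./data --model ./models/nvidia.h5 --type nvidia
```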

The resulting loss curve levelled off at a relatively high loss. In any case I downloaded the model onto the car and tried it. The car occasionally appeared to recognise a road edge, but most of the time it simply drove over it.

What’s Next?

In the next article I'll look into how OpenCV can be used for image preprocessing to help the training, along with finding bugs in the inference pipeline.
