Getting started with Jetson Nano and Donkey car aka Autonomous Car

Fei Cheung
Apr 21 · 7 min read
Jetson Nano

Update: I have fixed a few bugs/typos and uploaded an SD card image. You can find it at the bottom of the article.

Jetson Nano is a powerful and efficient single-board computer made for (buzzword alert) AI on the edge. At just USD 99, it gives the maker community every opportunity to harness the power of machine learning.

I have been playing around with Donkey car for some time using a Raspberry Pi. I absolutely love it and appreciate the effort from the community. I am able to train it with a simple CNN, but the computational power soon falls short when I add more sensors, for example an IMU or a lidar. And a computationally intensive model will not get a good framerate, or may not be able to run on the Pi at all. I need something more powerful but not too expensive. 😛 Something below USD 100.

Left: with Lidar and IMU installed; right: standard Donkey car setup.

And here it is: the Jetson Nano. Unlike the Raspberry Pi, the Jetson Nano was released just a few weeks ago, and there are few tutorials and projects for it. I had a difficult time setting up the Donkey car and decided to write a (and my first 😆) tutorial on how to set things up. Let's get started.


For the Jetson Nano part, you will need a Jetson Nano, a micro SD card, and a wifi USB dongle.

I am pretty surprised that the dev kit doesn't come with onboard wifi and Bluetooth. I followed the advice from an Nvidia tutorial to write the image to the SD card and bought the suggested wifi USB dongle: the Edimax EW-7811Un.

All you need to do is follow the Nvidia tutorial and boot up the device. Once you see the welcome screen, congratulations!

Image credit: Nvidia

However… while I was testing it, the wifi kept disconnecting every few minutes, and I could not download and install the packages I needed. I spent two days 😕 trying to find a solution, and the following command will make life easier.

echo "blacklist rtl8192cu" | sudo tee -a /etc/modprobe.d/blacklist.conf

This disables the buggy driver, and the wifi seems to return to normal. Another (better) solution is to recompile the driver; you can find the instructions here.

Start installing the packages

This is an embedded system dedicated to machine learning. It won't be complete without a machine learning framework! You can find the following information in the Nvidia forums too.

Let’s start with Tensorflow first!

sudo apt-get install python3-pip libhdf5-serial-dev hdf5-tools
pip3 install --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v42 tensorflow-gpu==1.13.1+nv19.3 --user

It is going to take a long time, and things will seem frozen. It took around 45 minutes on my machine to set things up.

Remember to test that everything is properly installed. Make sure there are no error messages.

donkey@donkey-desktop:~$ python3
Python 3.6.7 (default, Oct 22 2018, 11:32:17)
[GCC 8.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorflow as tf
>>>

Why not install Pytorch too?

wget https://nvidia.box.com/shared/static/veo87trfaawj5pfwuqvhl6mzc5b55fbj.whl -O torch-1.1.0a0+b457266-cp36-cp36m-linux_aarch64.whl
pip3 install numpy torch-1.1.0a0+b457266-cp36-cp36m-linux_aarch64.whl

Again, test things before moving forward

donkey@donkey-desktop:~$ python3
Python 3.6.7 (default, Oct 22 2018, 11:32:17)
[GCC 8.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> print(torch.__version__)
1.1.0a0+b457266
>>> print('CUDA available: ' + str(torch.cuda.is_available()))
CUDA available: True
>>>

Last but not least, Keras, which we will need for the Donkey car.

pip install didn't work for me, so I used the following method:

sudo apt-get install python3-scipy
sudo apt-get install python3-keras

Test:

donkey@donkey-desktop:~$ python3
Python 3.6.7 (default, Oct 22 2018, 11:32:17)
[GCC 8.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import keras
Using TensorFlow backend.
>>>

The software part is mostly finished. Let's move on to the hardware.


I am not going to go through the Donkey car installation procedure step by step. Newcomers, please visit here for more information. I am going to highlight some key points to make things work on the Jetson Nano.

  1. PCA9685 PWM driver
  2. Camera

First, update GPIO library

Nvidia already provides a GPIO library, and what is amazing is that it has the same API as RPi.GPIO, so almost nothing needs to be changed to port an RPi library to the Jetson Nano. Follow the instructions on the Nvidia GitHub to install the library, and you can test the GPIO too.

Second, install the PCA9685 servo library for controlling the steering and throttle

pip3 install Adafruit_PCA9685

Connect the PCA9685 to the Jetson Nano. You should be able to see the pin numbers from the silkscreen marks.

VCC <-> 3.3v
SDA <-> SDA (pin 3)
SCL <-> SCL (pin 5)
GND <-> GND

And as usual, test the connection.

donkey@donkey-desktop:/opt/nvidia$ sudo i2cdetect -y -r 1
     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
00:          -- -- -- -- -- -- -- -- -- -- -- -- --
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
20: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
40: 40 -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
60: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
70: 70 -- -- -- -- -- -- --

Look at address 0x40: it is our PCA9685. For those who want to know more about the I2C protocol, you can visit here.
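Under the hood, the servo driver works by counting ticks: the PCA9685 has a 12-bit counter, so each PWM cycle is split into 4096 steps, and a servo pulse width maps to a tick count. Here is a sketch of that arithmetic; the function name and the 60 Hz default are my own illustration, not the Donkey car defaults.

```python
# PCA9685 arithmetic sketch: the chip divides each PWM cycle into 4096
# ticks (12-bit counter), so a pulse width in microseconds maps to a
# tick count as follows. Illustrative only, not the Donkey car code.

def pulse_to_ticks(pulse_us, freq_hz=60):
    """Convert a servo pulse width in microseconds to PCA9685 ticks."""
    period_us = 1_000_000 / freq_hz      # length of one PWM cycle in us
    return round(pulse_us / period_us * 4096)

# Typical hobby servos expect pulses of roughly 1000-2000 us:
print(pulse_to_ticks(1000))  # full one direction
print(pulse_to_ticks(1500))  # centre / neutral
print(pulse_to_ticks(2000))  # full the other direction
```

With the Adafruit library, a tick value like this is what ends up in a call such as `pwm.set_pwm(channel, 0, ticks)`.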

To access the I2C channel, the user needs to be added to the i2c group. You will need to reboot to activate it.

sudo usermod -a -G i2c username
sudo reboot

Check the group setting for the user:

donkey@donkey-desktop:~$ groups
i2c adm cdrom sudo audio dip video plugdev lpadmin gdm sambashare gpio

There it is.
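Since the group change only takes effect after you log out and back in (or reboot), it can be handy to script the check. A hypothetical helper, not part of Donkey car:

```shell
# Hypothetical helper: succeeds if NAME appears in the space-separated
# group list GROUPS (the same format the `groups` command prints).
in_group() {
  echo "$2" | tr ' ' '\n' | grep -qx -- "$1"
}

# Check the live session:
if in_group i2c "$(groups)"; then
  echo "i2c group active"
else
  echo "i2c group not active yet; log out and back in, or reboot"
fi
```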

Still with me? The next step is to set up the camera. The bad news is that the Pi Camera v1 doesn't work with the Jetson Nano, and neither does the wide-angle camera I had been using for a long time.

Leftmost: Pi Camera 2; left: Pi Camera 2 carrier board with wide-angle camera; right: old Pi wide-angle camera. Rubik's cube for scale ;P

The purchasing guide: avoid cameras with the OV5647 chip and use ones with the IMX219 chip. The IMX219 driver is pre-installed in the image.

CSI connector on Jetson nano

Open the lock, put the cable in the slot, close the lock. Done. Just be careful of the cable orientation: look into the connector to see which way the pins are facing. Power things up and everything should be fine. You can check this tutorial to see how to play with the camera.

Running face detection. The detection is not optimised, since I am using a wide-angle camera.

The last step: install the Donkeycar module

I have forked the original donkey car repo and made the necessary changes to make things work. You can download the Donkey car library from my repo and start installing the package.


git clone https://github.com/feicccccccc/donkeycar.git
cd donkeycar
pip3 install -e .

For those who are interested, I edited the following things:

  1. Add a new camera class
  2. Add a default bus to Actuator parts
  3. Add int typecasting in Keras.py for a variable to make training work

After a long wait (~45 min) for the necessary packages to install, you can create your own car folder with the following command.

donkey@donkey-desktop:~/sandbox$ donkey createcar d2

You need to make a few changes to manage.py to use the new camera.

#from donkeycar.parts.camera import PiCamera
from donkeycar.parts.camera import CSICamera

And also add the new camera parts to the vehicle.

#cam = PiCamera(resolution=cfg.CAMERA_RESOLUTION)
#V.add(cam, outputs=['cam/image_array'], threaded=True)
cam = CSICamera(resolution=cfg.CAMERA_RESOLUTION)
V.add(cam, outputs=['cam/image_array'], threaded=False)
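For the curious: a CSI camera on the Jetson Nano is typically read through a GStreamer pipeline built on Nvidia's `nvarguscamerasrc` element. Below is a sketch of what such a pipeline string looks like; the exact string and parameters in my fork's CSICamera class may differ.

```python
# Sketch of a GStreamer pipeline string for the Jetson Nano CSI camera.
# nvarguscamerasrc is Nvidia's camera source; nvvidconv moves frames out
# of NVMM memory so OpenCV can consume plain BGR images.
# (Illustrative; not necessarily the string used in my fork.)

def gstreamer_pipeline(width=160, height=120, fps=21):
    return (
        "nvarguscamerasrc ! "
        "video/x-raw(memory:NVMM), width={w}, height={h}, "
        "format=NV12, framerate={f}/1 ! "
        "nvvidconv ! video/x-raw, format=BGRx ! "
        "videoconvert ! video/x-raw, format=BGR ! appsink"
    ).format(w=width, h=height, f=fps)

# cv2.VideoCapture(gstreamer_pipeline(), cv2.CAP_GSTREAMER) would then
# deliver frames as numpy arrays.
```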

Everything should work just fine up to this step, and you can start driving and creating your own dataset!

donkey@donkey-desktop:~/sandbox/d2$ python3 manage.py drive

You can then log into the web server and control the car.

And the most exciting part! You can train your car locally on Jetson Nano!

Training through ssh session, using Tegra X1 from Jetson Nano.

Have fun!

Controlling the car from Jetson Nano and my PS4 controller. I was too lazy to dismount the Pi 😜

Edit:

  1. While I was writing this article, I found a USB Bluetooth dongle in my junk box. I plugged it into the Jetson Nano and it works! And I didn't need any extra settings to connect it to my PS4 controller. Yay! You can use the controller by following this tutorial. It is critical to get a good dataset, and a proper controller will help a lot.
  2. The steering and throttle seem to jam each other. The reason is the power-hungry Jetson Nano and motor: the momentary current draw is so large that a significant voltage drop affects the ESC signal. There are two solutions: use a separate power supply for the servo driver, or add a large capacitor to prevent the voltage drop. I prefer the latter, but I didn't have any caps at the moment, so you can see from the photo that I cut a USB cable, soldered 2 jumper wires to it, and connected it to the servo driver.
Extra power to V+ and V- on PCA9685
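To get a feel for the capacitor option, here is a back-of-envelope sizing using C = I·Δt/ΔV. All the numbers are assumptions for illustration, not measurements from my car:

```python
# Rough capacitor sizing via C = I * dt / dV.
# Assumed numbers, for illustration only:
i_transient = 2.0   # amps of momentary draw when the servo/motor kicks in
dt = 1e-3           # seconds the transient lasts
dv = 0.5            # volts of droop the ESC signal can tolerate

c_farads = i_transient * dt / dv
print("{:.0f} uF".format(c_farads * 1e6))  # prints "4000 uF"
```

So a few thousand microfarads across V+ and GND is in the right ballpark for a transient like this; a separate power supply avoids the problem entirely.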

  3. If you want the whole SD card image, you can find it here. The ID and password are both "donkey".

  4. You may also want to create a swap partition/file for the Jetson Nano, since it only has 4 GB of memory.

sudo fallocate -l 6G /var/swapfile
sudo chmod 600 /var/swapfile
sudo mkswap /var/swapfile
sudo swapon /var/swapfile
sudo bash -c 'echo "/var/swapfile swap swap defaults 0 0" >> /etc/fstab'
