ArIES, IIT Roorkee
8 min read · Sep 19, 2018

You might have heard of a line-following robot, but have you ever heard of a line-following drone? Sounds interesting, right? Well, this is what we worked on in December 2017. We were a team of five members working with full zeal and zest on making an autonomous drone that could follow a straight line and scan the barcodes and QR codes on boxes placed at different heights in a warehouse. The hardest part was making the quadcopter stable.

The Problem Statement

The primary objective was to develop a UAV which can fly indoors (with an onboard or offboard computation and tracking system).

There were two shelves of cardboard boxes with both QR codes (5 inches X 5 inches) and barcodes (2 inches height) on them. Each shelf had 2 rows of boxes (4 in each row). Some of the boxes had a hazardous symbol on them.

There was a marker line (plain yellow) parallel to the shelves, with small line strips perpendicular to it facing each column.

UAV had to perform these tasks autonomously on the field:

  • Start from the starting point.
  • Follow the line, scan all inventory boxes, and update each box's location (barcode ID, QR code ID, shelf, row, column) in real time on the ground station or by some other means.
  • Land on the endpoint.
Fig: Arena

Our Approach

We tried various methods to approach this problem statement. As the quadcopter had to move in a GPS-denied environment, we had two options:

  1. Complete automation using image processing (IP) for line following, moving the quad in GUIDED_NOGPS mode.
  2. Using an optical flow sensor for stabilization and moving the quad in POSHOLD mode by giving RC override commands, along with a bit of image processing.

Both methods had their own pros and cons. The first method was complicated, as it involved a lot of image processing, and the quadcopter was not very stable. The second method, which relied on the optical flow sensor, didn't really work on tiles and mats because of their poor texture, although optical flow worked very well on grass. So we had to discard the second method and went on to improve the first.

Line Following using Image Processing

  1. Capture an image.
  2. Warp the image to obtain a top-down view.
  3. Make a mask using the HSV color space.
  4. Apply a region-of-interest band across the top and the bottom of the image to obtain one coordinate from each.
  5. Use these two coordinates to compute the line's bearing and offset, from which yaw and roll commands are calculated.
  6. Send the commands in GUIDED_NOGPS mode.
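Steps 4 and 5 are the heart of the pipeline, so here is a minimal sketch of them in pure NumPy (the band height, the synthetic test mask, and the sign conventions are our illustrative assumptions; steps 1–3 would normally use OpenCV for capture, warping, and HSV masking):

```python
import numpy as np

def bearing_and_offset(mask, band=20):
    """Given a top-down binary mask of the line, take a band of rows at the
    top and at the bottom, find the line's centroid column in each, and
    return the bearing (radians, 0 = straight ahead) and the lateral offset
    in pixels from the image centre."""
    h, w = mask.shape
    top_xs = np.nonzero(mask[:band])[1]          # columns of line pixels, top band
    bot_xs = np.nonzero(mask[h - band:])[1]      # columns of line pixels, bottom band
    top_x = top_xs.mean() if top_xs.size else w / 2.0
    bot_x = bot_xs.mean() if bot_xs.size else w / 2.0
    bearing = np.arctan2(top_x - bot_x, float(h - band))  # tilt of the line
    offset = bot_x - w / 2.0                     # positive: line is to the right
    return bearing, offset

# Synthetic mask: a vertical "line" 10 px wide, centred at column 79.5
mask = np.zeros((120, 160), dtype=np.uint8)
mask[:, 75:85] = 1
b, o = bearing_and_offset(mask)
# bearing 0.0, offset -0.5 (line is straight ahead and nearly centred)
```

The bearing would feed the yaw command and the offset the roll command, each through its own controller.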

Components Used

Here are the main components we used for building our quadcopter:

  • Raspberry Pi 3: our main onboard computer, which communicated directly with the Pixhawk over the MAVLink protocol.
  • Picam 2.0: a high-quality camera that mounts directly on the Raspberry Pi. The Picam provided the video input, which was then processed to detect the yellow line.
  • TFmini LiDAR: a low-cost, unidirectional ranging LiDAR with a high-speed measurement sensor. It can accurately measure distances up to 12 m.
  • PX4Flow sensor: an optical flow sensor intended for GPS-denied environments, though it was of no help during the final presentation.
  • PPM encoder: encodes up to 8 PWM (pulse-width-modulated) signals into one PPM (pulse-position-modulated) signal. The Pixhawk expects a PPM signal from the transmitter, so the encoder converts the received PWM signals into PPM.
  • RC: a 6-channel transmitter-receiver pair for controlling the quadcopter manually.
  • Power module
  • BLDC motors: 4 brushless DC motors, each rated 1000 KV.
  • ESCs: 4 electronic speed controllers, each rated 30 A.
  • Battery: a 5200 mAh battery, which gave a flight time of approximately 15 minutes.

Connections with Pixhawk

The connections of different components with Pixhawk are explained here using pictures.

  1. TFmini with Pixhawk to serial 4/5.

Before using the TFmini, download the firmware from the Benewake site and update it using the firmware updater.

2. Raspberry Pi 3.0 with pixhawk to telemetry 2.

In the above picture, instead of connecting the red wire (5V) directly from the Pixhawk, we supplied 5V to the Pi through the BEC of one of the 4 ESCs.

3. Safety Switch and Buzzer

Connect the safety switch and the buzzer to their respective ports on the Pixhawk. Also connect the power module to the power port of the Pixhawk, with the other two ends of the power module going to the power distribution board and to the battery.

4. ESCs

Here is the picture we usually referred to for the connections of the ESCs with the Pixhawk.

The MAIN OUT header on the Pixhawk has power, ground, and signal rows. The upper row is ground, the middle row is 5V, and the bottom row is signal.

The signal wire from the ESC of motor 1 goes to port 1 (signal row) of MAIN OUT on the Pixhawk, the ESC of motor 2 to port 3, the ESC of motor 3 to port 2, and the ESC of motor 4 to port 4. Make sure to connect the +5V and ground wires of any 3 of the ESCs in the same way to the corresponding power and ground ports on the Pixhawk.

The +5V and ground wires of one ESC will still be left over; these are used to power the Raspberry Pi.

5. PPM encoder

Connect the receiver to the PPM encoder. The other end connects to the Pixhawk as shown in the following image.

6. Picam to Pi

Connect the Picam to the Pi. Mount the Picam facing downward, tilted at an angle from the vertical.

Firmware and Parameters

We will be using both Mission Planner and QGroundControl, so download both.

We uploaded the standard ArduPilot (APM) firmware using QGroundControl, as GUIDED_NOGPS is supported only by the standard firmware.

Now calibrate the Pixhawk's sensors, calibrate the transmitter signals, and choose the flight modes. We set GUIDED_NOGPS and STABILIZE as the flight modes, which can be switched easily using a transmitter channel switch.

Set the following parameters using either Mission Planner or QGroundControl.

  • AHRS menu
  • EK2 menu
  • Arming menu
  • LiDAR rangefinder parameters

Installing modules on Raspberry Pi

pip install dronekit

pip install dronekit-sitl

If an error occurs while installing pymavlink or MAVProxy, try this:

pip install -U pymavlink mavproxy

And finally, we need to disable the OS's control of the Pi's serial port.
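On Raspbian this is typically done with raspi-config (the menu wording below varies between OS versions; treat this as a sketch rather than our exact steps):

```shell
# Run the Raspbian configuration tool, then choose:
#   Interfacing Options -> Serial ->
#     "Would you like a login shell over serial?"        -> No
#     "Would you like the serial port hardware enabled?" -> Yes
sudo raspi-config

# Equivalent manual edit: remove the "console=serial0,115200" entry
# from /boot/cmdline.txt, then reboot.
sudo reboot
```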

Now let's move on to establishing the MAVProxy connection between the Pi and the Pixhawk.

Open a terminal on the Pi and type

sudo -s

This gives you a root shell. Next, type the following:

Here the first IP, 192.168.43.135, is the IP address of our Pi, and the second IP address is that of our ground station (the PC running Mission Planner), which connects over UDP.
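The screenshot with the exact command has not survived here, but a typical MAVProxy invocation matching that description would look something like the following (the serial device, baud rate, and ports are assumptions, and <GCS_IP> stands for the ground-station address, which we cannot recover from the post):

```shell
# Forward MAVLink from the Pixhawk's serial link to the Pi itself
# (for the dronekit script) and to the ground-station PC over UDP.
mavproxy.py --master=/dev/serial0 --baudrate 921600 \
            --out udp:192.168.43.135:14550 \
            --out udp:<GCS_IP>:14550
```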

Now that the connection is established, open Mission Planner on the ground station (PC) and connect using UDP on port 14550.

Running the code

Open another terminal, go to the directory containing your code, and type

where "your_code.py" is the Python file to be run and "192.168.43.135" is the IP address of the Pi; then press Enter.
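The command itself was in a screenshot; it would have been along these lines (taking the connection string as a --connect argument is a common dronekit-script convention, assumed here):

```shell
# Port must match one of the --out endpoints given to MAVProxy.
python your_code.py --connect udp:192.168.43.135:14550
```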

Autonomous Takeoff

For autonomous takeoff there should be zero drift, so balance your quadcopter as well as possible. We used the set_attitude function to send roll, pitch, yaw, and throttle commands; the roll, pitch, and yaw commands were calculated using image processing, while the throttle commands were generated from the height reported by the LiDAR. A PID algorithm generated the throttle values to reach and maintain a fixed altitude.
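To illustrate the throttle loop, here is a toy discrete PID controller driving a crude one-dimensional quad model toward a target height (the gains, the 0.5 hover throttle, and the dynamics are illustrative assumptions, not our flight values):

```python
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, error):
        """One PID step: proportional + accumulated integral + derivative."""
        self.integral += error * self.dt
        deriv = (error - self.prev_err) / self.dt
        self.prev_err = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# Toy simulation: throttle above the hover point (0.5) accelerates the quad up.
pid = PID(kp=0.6, ki=0.1, kd=0.3, dt=0.05)
target, height, velocity = 1.5, 0.0, 0.0     # metres, m/s
for _ in range(600):                          # 30 simulated seconds
    throttle = 0.5 + pid.update(target - height)
    throttle = min(max(throttle, 0.0), 1.0)              # clamp like a real ESC
    accel = 20.0 * (throttle - 0.5) - 1.0 * velocity     # crude lift plus drag
    velocity += accel * pid.dt
    height += velocity * pid.dt
# height settles near the 1.5 m target
```

In flight, `height` would come from the LiDAR each frame and `throttle` would go out through set_attitude instead of the toy model.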

Line Following and QR codes scanning

After the quadcopter reached the target altitude, the next task was to follow the yellow line using image processing: contours were detected in the input frames, a straight line was fitted through them, and the corresponding yaw, pitch, and roll values were generated.

We also had to scan the QR codes, which we approached in two ways:

  1. First, we gave the quadcopter a very small pitch so that it could scan the QR codes while passing over them without stopping. But this gave no results in real time: the scanner needed almost 0.5 s to focus on the image before decoding, so this idea failed badly.
  2. Next, we detected the junctions instead: the quadcopter hovered at each junction until the QR code was scanned and was then allowed to follow the yellow line again until the next junction was reached.
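The second approach boils down to a two-state machine; here is a toy sketch of that control logic (the boolean inputs stand in for the real junction-detection and scanner results, which are assumed):

```python
def step(state, junction_seen, qr_scanned):
    """FOLLOW: track the yellow line until a perpendicular strip appears.
    HOVER: hold position over the junction until the QR code is read."""
    if state == "FOLLOW" and junction_seen:
        return "HOVER"
    if state == "HOVER" and qr_scanned:
        return "FOLLOW"
    return state

# One junction pass: follow -> strip seen -> hover two frames -> scan -> follow
events = [(False, False), (True, False), (False, False), (False, True)]
state = "FOLLOW"
trace = []
for junction, scanned in events:
    state = step(state, junction, scanned)
    trace.append(state)
# trace == ["FOLLOW", "HOVER", "HOVER", "FOLLOW"]
```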

Autonomous Landing

This was the most satisfying part. After the yellow line had been followed and all the QR codes scanned, the quadcopter arrived at the landing point (endpoint), where we used the Pixhawk's built-in LAND mode, which landed it smoothly.

Some Tips

  • Remember, you don't need an actual TTL-to-USB converter; you can make one out of an Arduino.
  • If you have never worked with a 4-in-1 ESC before, use 4 separate ESCs instead, as there is a risk of losing all 4 at once if any fault occurs (we damaged all of ours at once).
  • Use dampeners to reduce vibrations on the Pixhawk.
  • The LiDAR gives garbage values at distances below 30 cm, so don't be scared.
  • Don't try to arm in GUIDED mode, as the quadcopter will disarm itself after some time because it has no GPS.
  • Don't use the dronekit built-in function vehicle.simple_takeoff, as it does not work in GUIDED_NOGPS; use set_attitude instead.
  • Commands like condition_yaw and ned_velocity don't work in POSHOLD mode.
  • Calibrate level in Mission Planner before running the code, for minimum drift.
  • Make a disc image of the Pi, in case of any failure.
  • Not every memory card is compatible with the Pixhawk; be sure to check online first.
  • We prefer the APM firmware over the PX4 firmware, as dronekit doesn't work properly with PX4.
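For reference, the set_attitude function mentioned above works by sending a MAVLink SET_ATTITUDE_TARGET message, whose core is an Euler-to-quaternion conversion. Here is a self-contained sketch of that conversion (the function name and argument conventions follow common ArduPilot GUIDED_NOGPS examples; treat the details as assumptions):

```python
import math

def to_quaternion(roll_deg=0.0, pitch_deg=0.0, yaw_deg=0.0):
    """Convert Euler angles (degrees) to the [w, x, y, z] quaternion that
    MAVLink's SET_ATTITUDE_TARGET message expects."""
    r = math.radians(roll_deg) / 2.0
    p = math.radians(pitch_deg) / 2.0
    y = math.radians(yaw_deg) / 2.0
    cr, sr = math.cos(r), math.sin(r)
    cp, sp = math.cos(p), math.sin(p)
    cy, sy = math.cos(y), math.sin(y)
    return [cr * cp * cy + sr * sp * sy,
            sr * cp * cy - cr * sp * sy,
            cr * sp * cy + sr * cp * sy,
            cr * cp * sy - sr * sp * cy]

# A level attitude maps to the identity quaternion.
q = to_quaternion(0, 0, 0)   # [1.0, 0.0, 0.0, 0.0]
```

On the vehicle, the resulting quaternion and a normalized thrust value would be packed with dronekit's message_factory and sent at a few hertz.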

Artificial Intelligence and Electronics Society (ArIES) is a campus group of IIT Roorkee, with a mission to solve impactful problems of society.