Towards Advanced Autonomous Drone Operations: Implementing LiDAR-based Object Localization via Raycasting, and Optimizing Parallel Search Patterns

Dave · Published in d*classified · 10 min read · Jul 14, 2023

Chiew Chun Jia developed a raycasting feature to localize targets in areas without Global Navigation Satellite System (GNSS) coverage, and evaluated search patterns to maximize the chances of mission success within a drone’s endurance limits. Her work enhances rescue drone capabilities for challenging mission environments. Chun Jia was mentored by Perrie Lim and Jeremy Wong from DSTA’s Digital Hub and Air Systems Programme Centre. This post is an adaptation of her original work.

TL;DR: The goal of this project was to improve both object detection accuracy and drone exploration efficiency in challenging mission environments. We use raycasting to automatically localize an object using Light Detection and Ranging (LiDAR) data, and evaluate search patterns to make the most of limited drone endurance.

Introduction

The increased integration of drone technology across industries has led to their growing utilization in specialized tasks like target tracking and large-scale mapping. We delve into two significant aspects of drone autonomy: accurate object localization and efficient exploration using predefined search patterns.

I. Precision Object Localization using LiDAR

Lidar Point Cloud image from MATLAB Math Works (https://www.mathworks.com/help/vision/ref/pcshow.html)

A. LiDAR-based Localization Principle

UAVs equipped with LiDAR sensors, combined with the YOLOX object detection framework, enable advanced object localization. LiDAR sensors, working on the time-of-flight principle, generate a detailed spatial representation of the drone’s environment in the form of a 3D point cloud.

The crucial component of object localization is the process of raycasting, which involves the projection of a direction vector from the LiDAR sensor towards the detected points. This facilitates the calculation of the detected object’s position relative to the drone’s coordinates.

Visualisation of Point Cloud from The Data Visualization Research Lab (https://ccom.unh.edu/vislab/projects/point_clouds/)

B. Pseudocode for Object Localization via Raycasting

Each LiDAR point has x, y, and z values that determine its position in the environment relative to the origin, i.e. the position of the LiDAR sensor.

The raycasting method projects a direction vector from the coordinates of the LiDAR sensor towards the LiDAR points. This lets us approximate the closest LiDAR point that lies on the target along that specific direction, and hence the target's position relative to the origin of the direction vector (in this case, the drone).

The object localization process via raycasting can be represented as follows:
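The original pseudocode was shared as an image; here is a minimal Python sketch of the same idea, assuming the point cloud arrives as an N×3 NumPy array in the sensor frame and that the direction vector (e.g. derived from the centre of a YOLOX bounding box) is already known. The function and parameter names are illustrative:

```python
import numpy as np

def locate_object(points, direction, max_perp_dist=0.5):
    """Return the closest LiDAR point lying (approximately) on the ray.

    points:        (N, 3) array of LiDAR points in the sensor frame,
                   with the LiDAR sensor at the origin.
    direction:     vector from the sensor towards the detected object.
    max_perp_dist: how far (in metres) a point may sit off the ray and
                   still be counted as lying on it (an assumed tolerance).
    """
    direction = np.asarray(direction, dtype=float)
    direction = direction / np.linalg.norm(direction)

    # Distance of each point along the ray; ignore points behind the sensor.
    along = points @ direction
    ahead = along > 0.0

    # Perpendicular distance of each point from the ray.
    perp = np.linalg.norm(points - np.outer(along, direction), axis=1)
    on_ray = ahead & (perp < max_perp_dist)
    if not np.any(on_ray):
        return None  # no LiDAR return along this direction

    # The nearest qualifying point approximates the object's position
    # relative to the drone.
    idx = np.flatnonzero(on_ray)
    return points[idx[np.argmin(along[idx])]]
```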

C. Visualization of Results

Testing the above code in simulation, we confirmed that it returns the correct coordinates of the closest point on the object in front of the drone (a wall in the simulation).

II. Parallel Search Pattern Optimization for Autonomous Exploration

The International Aeronautical and Maritime Search and Rescue Manual (IAMSAR) Search Patterns e-manual is an essential reference for any search and rescue robotics application. Selecting an optimal search pattern is critical to mission success, and the choice depends largely on mission constraints and environmental characteristics. We eventually selected the parallel search pattern for its broad coverage and minimal overlap between search paths; we unpack our selection process, drawn from a literature review, in the technical annex. In our tests, parallel search gave the best coverage: the drone detected the most objects placed within the mission boundary.

Photo by Paul Keiffer on Unsplash

The parallel search pattern requires computing the optimal spacing between parallel tracks, which we set equal to the camera’s horizontal field of view (HFOV). This is derived using the Pinhole Camera Model and basic trigonometry.

A. Implementing the Parallel Search

First, we determine the optimal spacing between tracks, equal to the camera’s horizontal field of view (HFOV). That requires us to calculate the working distance (WD) of the camera mounted on the UAV, given that we only know the camera’s horizontal angular field of view (AFOV). And this is where the Pinhole Camera Model comes in handy.

Reference: https://www.edmundoptics.com.sg/knowledge-center/application-notes/imaging/understanding-focal-length-and-field-of-view/

Since we know the value of AFOV, we just have to find the WD. How do we go about doing that?

In Unreal Engine, we set up a forest environment and flew the drone through it. We placed a casualty vehicle (the rescue target) at varying distances and positions within the forest, in front of the drone’s camera. Using YOLOX object detection, we estimated the maximum distance at which the object is still classified accurately. This distance is the WD.

Photo by Tobias Tullius on Unsplash
Measurement of Working Distance (WD) in Unreal Engine
Experiment results
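Putting the pieces together: with the camera’s AFOV from its datasheet and the WD measured above, the HFOV (and hence the track spacing) follows from the pinhole camera model. A minimal sketch; the 90° AFOV and 20 m WD below are illustrative placeholders, not our measured values:

```python
import math

def track_spacing(afov_deg, working_distance_m):
    """Pinhole camera model: HFOV = 2 * WD * tan(AFOV / 2).

    The horizontal field of view (in metres) at the working distance
    is used as the spacing between parallel tracks.
    """
    return 2.0 * working_distance_m * math.tan(math.radians(afov_deg) / 2.0)

# Illustrative numbers only:
print(track_spacing(afov_deg=90.0, working_distance_m=20.0))  # ≈ 40.0 m
```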

Since the parallel track follows a repetitive pattern, we can generate accurate waypoint coordinates for the drone in a loop. For the drone to detect the waypoints reliably, the maximum distance between consecutive waypoints is 20 m.

The drone’s initial position is at coordinate (0, 0, 0), defined as the middle of the search boundary. It first moves to the opening waypoint of the search pattern, at a corner of the grid, before commencing the parallel search.
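The waypoint-generation loop itself was shown as an image in the original post; below is a minimal sketch of one way to write it, assuming a square boundary centred on the origin, a fixed flight altitude, and the track spacing computed earlier. All names and the example spacing are illustrative:

```python
import math

def parallel_waypoints(side_m, spacing_m, altitude_m=5.0, max_leg_m=20.0):
    """Boustrophedon (parallel-track) waypoints over a square AOI of side
    `side_m`, centred on (0, 0). Consecutive waypoints are at most
    `max_leg_m` apart so the drone can reliably detect each one.
    """
    half = side_m / 2.0
    x, y = -half, -half                 # first waypoint: a corner of the AOI
    waypoints = [(x, y, altitude_m)]
    heading_up = True
    while x <= half:
        y_end = half if heading_up else -half
        # Subdivide the straight leg into segments of <= max_leg_m.
        n = max(1, math.ceil(abs(y_end - y) / max_leg_m))
        for i in range(1, n + 1):
            waypoints.append((x, y + (y_end - y) * i / n, altitude_m))
        y = y_end
        x += spacing_m
        if x <= half:
            waypoints.append((x, y, altitude_m))   # cross-track transit
        heading_up = not heading_up
    return waypoints

# Illustrative call; the spacing would come from the HFOV computed above.
for wp in parallel_waypoints(side_m=50.0, spacing_m=10.0):
    print(wp)
```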

For a 50 by 50 grid, the code will return waypoints like this:

Code output of waypoints
Successfully simulated flight path of drone executing a parallel search pattern

CONCLUSION & PERSONAL REFLECTION

The integration of precise object localization and optimized parallel search patterns discussed in this project enhances drones’ efficiency in search and rescue missions. Future work includes hardware-in-the-loop tests and putting my algorithm through its paces in the field.

This was a fruitful and enjoyable internship experience, as I have learnt much from my mentors, the DH-PC4 autonomy team, the rest of DH and fellow interns. In my opinion, the main takeaway was the process of learning new things (ROS, Linux OS and its command line interface). Not only have I learnt how to code better, but also how to approach projects. I found that a head-on approach to projects is not beneficial. Instead, we should set clear aims, highlight what we know and do not know, and split the project into smaller tasks which are easier to manage.

Chun Jia (middle, in spectacles)

References:

[1] ROS. (n.d.). Why ROS? https://www.ros.org/blog/why-ros/

[2] Velodyne Lidar. (2022, June 3). What is Lidar? Learn how Lidar works. https://velodynelidar.com/what-is-lidar/

[3] Hollows, G., & James, N. (n.d.). Understanding focal length and field of view. Edmund Optics. https://www.edmundoptics.com.sg/knowledge-center/application-notes/imaging/understanding-focal-length-and-field-of-view/

— —

TECHNICAL ANNEX FOR SEARCH PATTERN ANALYSIS (A SPECIAL SECTION FOR OUR FELLOW GEEKS OUT THERE)

A. Analysis of potential search patterns

Diagrams of Search Patterns from A Decision-Making Algorithm for Maritime Search and Rescue Plan (Ref: https://www.mdpi.com/2071-1050/11/7/2084)

(i) Back and Forth (Parallel and Creeping line tracks)

Reference paper hyperlink

This search pattern is widely used by drones and is preferable when the search area is large and there is little to no information on the probable location of the search target.

This pattern has the best navigational properties, as it consists of long straight lines with few turns. Depending on the drone’s orientation, the search legs run parallel either to the long sides of the designated search area (parallel track) or to the short sides (creeping line track).

Long straight lines also mean the drone can travel at a higher speed between turns, lowering the flight time.

However, during turns, the drone may miss out on some data along the edges of the assigned area of interest. This can be resolved if the turns are made outside of the Area of Interest (AOI).

(ii) Sector Search

Reference paper hyperlink

Sector search is useful when the target is difficult to detect, with the advantage that the drone crosses the starting point of the search pattern several times from different directions. It is best employed for small search areas.

(iii) Expanding Square Search

Reference paper hyperlink

The Expanding Square Search has a large AOI coverage and is hence used when uniform coverage of the search area is desired and there is minimal information about the location of the search target. However, it takes longer to finish its path than the Back and Forth track. It also loses some coverage of the centre of the AOI, due to the sharp turns the drone makes at the start of the search pattern.

(iv) Zamboni pattern (a back-and-forth variant)

Reference paper hyperlink

This pattern is suitable when the AOI is large and regularly (square) shaped. It mitigates the issue with the back & forth (B&F) method of collecting data outside the AOI by using a larger turning radius, leading to more extensive coverage of the AOI.

(v) Barrier Patrol pattern

Reference paper hyperlink

The barrier patrol search pattern is preferable when the object to detect is fast-moving (for instance, at sea with a strong current) [6]. This may be because the waypoints are placed along the borders of the AOI. The pattern consists of 12 waypoints at a fixed distance from the centre point.
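As a sketch of one plausible reading of that description (12 waypoints evenly spaced at a fixed distance from the centre; the circular layout and all parameters are assumptions on our part, not geometry taken from the paper):

```python
import math

def barrier_patrol_waypoints(radius_m, altitude_m=5.0, n_points=12):
    """12 waypoints at a fixed distance from the centre point,
    spaced evenly around it (assumed layout)."""
    return [(radius_m * math.cos(2.0 * math.pi * k / n_points),
             radius_m * math.sin(2.0 * math.pi * k / n_points),
             altitude_m)
            for k in range(n_points)]
```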

B. Searching forested areas — some considerations

  • If the search pattern is too detailed, the drone might collide with trees, since it flies at a low altitude
  • If the search pattern is too sparse, or does not observe an area multiple times, it might miss details that are blocked by trees at certain angles
  • The number of turns matters: turns are more costly for the aircraft to perform than straight lines, and they also significantly affect the quality of data gathered by fixed cameras
  • Since trees sit at random positions around the forest, search patterns with multiple waypoints could be safer, as waypoint positions can be adjusted; with the parallel or square search, the path is straight most of the time, increasing the probability of colliding with trees
  • The target may not have a known or deducible location, due to the repetitive features of a forest

C. Down-selection for a forest search-and-rescue mission

Based on the factors listed above, we can narrow down the search patterns that satisfy the requirements for forest rescue.

Where’s Wally? Photo by Sergei A on Unsplash

Detection of a stationary object

When searching for a stationary subject, the goal is to cover as much area as possible, as fast as possible, or with the least distance covered.

Parallel search has been shown to perform well in searching for stationary objects in a given area, with the least distance travelled and coverage similar to other methods (e.g. the square search). This is also demonstrated in this paper, where all survivors were found using the parallel search method in the shortest time. Moreover, when the point of last contact with the target is uncertain and the AOI is large, the parallel pattern is preferred, since there is an equal probability of finding the target at any spot in the AOI.

Different sizes of AOI and available information

However, for relatively small areas, and when the target’s approximate location is known, the square and sector search patterns can be better alternatives.

Considering the characteristics of forests, the sector search may be a better option than the square search: it provides good coverage around the starting point, which may be obscured by trees, and covering the centre area, where the target is approximated to be, multiple times from different directions allows for a more accurate search. A square search, with its extensive coverage, might also lead to frequent collisions with trees.

Literature review on search pattern vs. targets found and search length (https://ntnuopen.ntnu.no/ntnu-xmlui/handle/11250/261317, Sections 5, 7 and 9 mainly)

Taking into account that the AOI would be split into smaller grids for search, and that there is a high probability we have little to no information on the target’s location, we can choose either the parallel search or the sector search: parallel search when the target’s location is unknown, and sector search when we know its approximate location within a certain grid. The parallel method can also be combined with the Zamboni pattern if we find that the parallel track misses targets at the edges of the AOIs.

Since the target in this project is not likely to be fast-moving, the barrier patrol search pattern is less preferable than the other methods: it detected the fewest stationary objects even though it has the shortest flight path.

A possible way to implement search

For the first few grids, a parallel search can be done, since the target could be anywhere within the AOI. For the last grids to be searched by each drone, a sector search can be done: if the target has not been found in any of the previous grids, it is likely to be in the last ones.
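A minimal sketch of that scheduling idea; the function name and the choice of how many final grids get a sector search are illustrative:

```python
def pattern_for_grid(grid_index, total_grids, last_n_sector=2):
    """Parallel search for most grids; sector search for the final few,
    where the target is most likely if earlier grids came up empty."""
    if grid_index >= total_grids - last_n_sector:
        return "sector"
    return "parallel"

# e.g. with 6 grids per drone: grids 0-3 parallel, grids 4-5 sector
print([pattern_for_grid(i, 6) for i in range(6)])
```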

References for this section:

[1] https://www.researchgate.net/publication/356210802_A_Multi-Objective_Coverage_Path_Planning_Algorithm_for_UAVs_to_Cover_Spatially_Distributed_Regions_in_Urban_Environments/download

[2] https://www.researchgate.net/publication/330147407_Survey_on_Coverage_Path_Planning_with_Unmanned_Aerial_Vehicles

[3] https://ntnuopen.ntnu.no/ntnu-xmlui/handle/11250/261317 (Sections 5, 7 and 9 mainly)

[4] https://paginas.fe.up.pt/~ee07245/wp-content/uploads/2012/04/pattern.pdf

[5] https://www.researchgate.net/publication/343267206_A_Fully_Autonomous_Search_and_Rescue_System_Using_Quadrotor_UAV

[6] https://scholar.afit.edu/cgi/viewcontent.cgi?article=5304&context=etd
