Harnessing Sensors and Software

A Reflection and Recap of My Internship at 99P Labs

Edward Lui
99P Labs
6 min read · Aug 10, 2023

--

Introduction

Over the course of my two-month internship at 99P Labs, I engaged with a range of technologies and methodologies, focusing primarily on sensors and their integration with the Robot Operating System (ROS). My contributions supported the SOMEthings project, an initiative for testing mobility technologies.

Robot Operating System (ROS)

At the outset, I was introduced to the Robot Operating System (ROS). ROS is a flexible framework for writing robot software, providing tools, libraries, and conventions to simplify the task of creating complex and robust robot behavior across a wide variety of robotic platforms. It served as the software foundation for the SOMEthings Project vehicles. My initial assignment involved creating a simple publisher and subscriber system within ROS, which facilitates communication between different parts of a robot’s system.
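The publisher/subscriber pattern described above can be sketched as follows. This is a minimal illustration assuming ROS 1 with rospy; the topic name "/chatter" and the message text are illustrative, not taken from the project. The rospy imports are deferred into the functions so the message-formatting helper stands on its own.

```python
# Minimal ROS 1 publisher/subscriber sketch (topic name "/chatter" is illustrative).
# rospy is imported inside each entry point so the pure helper below has no ROS dependency.

def make_message(count):
    """Format the payload the publisher sends on each tick."""
    return "hello world %d" % count

def publisher():
    import rospy
    from std_msgs.msg import String
    rospy.init_node("talker")
    pub = rospy.Publisher("/chatter", String, queue_size=10)
    rate = rospy.Rate(10)  # publish at 10 Hz
    count = 0
    while not rospy.is_shutdown():
        pub.publish(String(data=make_message(count)))
        count += 1
        rate.sleep()

def subscriber():
    import rospy
    from std_msgs.msg import String
    rospy.init_node("listener")
    # The callback fires each time a message arrives on the topic.
    rospy.Subscriber("/chatter", String, lambda msg: rospy.loginfo(msg.data))
    rospy.spin()
```

Run the publisher and subscriber as two separate nodes; ROS routes messages between them over the shared topic.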

Intel RealSense D435i Depth Camera

A significant portion of my internship revolved around the Intel RealSense D435i Depth Camera, a state-of-the-art imaging device that captures depth data using infrared projections and sensors. The objective was to harness its object detection and depth perception capabilities to provide vehicles with critical decision-making data. I began by studying the camera’s datasheet and noting its features and capabilities. My research revealed that Intel had developed a ROS wrapper for the camera, allowing for seamless integration with robotic systems. Once set up, I fine-tuned parameters such as frames per second, resolution, and IMU settings to optimize its performance. Collaborating with the team, we exported the data for post-analysis. The camera’s operation is based on a pair of stereoscopic infrared cameras, an RGB camera for capturing color data, and an infrared laser projector that aids in depth perception. My insights and findings on this topic were documented in a dedicated blog post.
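The kind of parameter tuning described above can be sketched with Intel’s pyrealsense2 bindings. This is an assumption-laden sketch, not the project’s actual configuration: the 640×480 @ 30 fps profiles are illustrative values for the FPS/resolution tuning, and the depth-scale conversion uses the D435i’s default of roughly 1 mm per raw unit.

```python
# Sketch of configuring the D435i's streams, assuming the pyrealsense2 bindings.
# Profile values (640x480 @ 30 fps) are illustrative, not the project's settings.

DEPTH_PROFILE = (640, 480, 30)  # width, height, frames per second
COLOR_PROFILE = (640, 480, 30)

def depth_to_meters(raw_depth, depth_scale=0.001):
    """Convert a raw 16-bit depth reading to meters (D435i default scale ~1 mm)."""
    return raw_depth * depth_scale

def start_camera():
    import pyrealsense2 as rs
    pipeline = rs.pipeline()
    config = rs.config()
    w, h, fps = DEPTH_PROFILE
    config.enable_stream(rs.stream.depth, w, h, rs.format.z16, fps)
    w, h, fps = COLOR_PROFILE
    config.enable_stream(rs.stream.color, w, h, rs.format.bgr8, fps)
    config.enable_stream(rs.stream.accel)  # IMU accelerometer
    config.enable_stream(rs.stream.gyro)   # IMU gyroscope
    profile = pipeline.start(config)
    return pipeline, profile
```

In the project itself the ROS wrapper handled this configuration; the sketch shows what those wrapper parameters map to underneath.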

HC-SR04 Ultrasonic Sensor

My next endeavor was the HC-SR04 Ultrasonic Sensor, a widely used module that measures distances with ultrasonic waves. With its ability to detect objects from 2 cm to 4 meters away, the HC-SR04 was the hardware chosen for near-distance ranging. An additional component, a bidirectional level shifter, is needed: the NVIDIA Jetson Xavier NX GPIO pins are limited to 3.3 volts, while the ultrasonic sensor operates and returns data at 5 volts. The level shifter translates signals between the two voltage levels, preventing damage to the Jetson. This is a wiring diagram of how it is supposed to be connected:

GPIO can be any free GPIO pin, or the pins referenced in the code (for example, pins 31 and 33); refer to the Jetson documentation for the board pin numbering. On the software side, the Jetson.GPIO library is required to interface with the GPIO pins, and the measured distance is published as a single float, in centimeters, to a topic called “/ultrasonic_distance”.
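A reading on this setup can be sketched as below, assuming the Jetson.GPIO library and BOARD pin numbering with the trigger on pin 31 and the echo (through the level shifter) on pin 33, matching the example pins mentioned above. The HC-SR04 reports distance as the width of its echo pulse: sound travels out and back, so the one-way distance is half the round trip.

```python
# Sketch of reading the HC-SR04 on a Jetson, assuming the Jetson.GPIO library.
# Pin assignments follow the example above (BOARD numbering: trigger 31, echo 33).

TRIG, ECHO = 31, 33
SPEED_OF_SOUND_CM_S = 34300.0  # speed of sound in air at roughly 20 degrees C

def pulse_to_cm(pulse_seconds):
    """Echo pulse width -> one-way distance: the pulse covers the round trip."""
    return pulse_seconds * SPEED_OF_SOUND_CM_S / 2.0

def read_distance_cm():
    import time
    import Jetson.GPIO as GPIO
    GPIO.setmode(GPIO.BOARD)
    GPIO.setup(TRIG, GPIO.OUT)
    GPIO.setup(ECHO, GPIO.IN)
    # Fire a ~10 microsecond trigger pulse to start a measurement.
    GPIO.output(TRIG, GPIO.HIGH)
    time.sleep(10e-6)
    GPIO.output(TRIG, GPIO.LOW)
    # Time the echo pulse, with a timeout in case nothing reflects back.
    timeout = time.time() + 0.03
    start = end = time.time()
    while GPIO.input(ECHO) == 0 and time.time() < timeout:
        start = time.time()
    while GPIO.input(ECHO) == 1 and time.time() < timeout:
        end = time.time()
    return pulse_to_cm(end - start)
```

In the project, a value like this was wrapped in a ROS node and published to “/ultrasonic_distance”.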

DW1000 UWB Module

To emulate GPS functionality, especially in confined settings where traditional GPS is less effective, I engaged with the DW1000 UWB (Ultra-Wideband) Module, known for its precise distance measurements using ultra-wideband radio waves. For effective positioning, the modules were configured in two roles: as ‘anchors’, stationary points with known locations, and as ‘tags’, mobile points whose locations need to be determined. Calibrating the antenna delay was crucial for accurate readings, ensuring that the time taken for signals to travel is accounted for. The environment posed challenges: objects like metal, concrete, and reflective surfaces can distort or reflect UWB signals, potentially leading to inaccuracies. To process and utilize the data from the UWB modules, integration with ROS was essential, allowing for real-time data reading and positioning display. For ease of programming and flexibility, the UWB modules were interfaced with an ESP32 Wi-Fi/Bluetooth module, a popular microcontroller, and programmed using the Arduino software platform. Since the only data received is the distance from each anchor, simple trigonometry is used to determine the x and y coordinates; at least two anchors are needed. I then mapped the data onto a 2D coordinate plane to visualize the tag’s position using Streamsync.
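The trigonometry step above can be sketched as follows. Assuming the two anchors sit at (0, 0) and (d, 0), the tag lies at the intersection of two circles of radius r1 and r2; with only two anchors that intersection is mirrored about the line between them, so one branch (y ≥ 0) must be chosen by convention. The coordinate frame and function name here are illustrative, not taken from the project’s code.

```python
import math

def tag_position(r1, r2, d):
    """Estimate (x, y) of a tag from its distances r1, r2 to two anchors.

    Anchors are assumed at (0, 0) and (d, 0). Intersecting the two circles
    gives x directly; with only two anchors the solution is mirrored about
    the x-axis, so we take the y >= 0 branch.
    """
    x = (r1**2 - r2**2 + d**2) / (2.0 * d)
    y_squared = r1**2 - x**2
    # Clamp small negative values caused by ranging noise before the sqrt.
    y = math.sqrt(max(y_squared, 0.0))
    return x, y
```

For example, with anchors 4 m apart and a tag equidistant from both, the tag lands on the perpendicular bisector between them. A third anchor would resolve the mirror ambiguity outright.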

Sensor Fusion

One of the intricate components of my internship was delving into sensor fusion, a technique that combines data from multiple sensors to improve the accuracy and reliability of the system’s outputs. In the context of the SOMEthings project, sensor fusion is invaluable as it can provide a more comprehensive understanding of the vehicle’s environment and its own state. Specifically, I worked on fusing data from the VESC IMU (Inertial Measurement Unit) and the camera IMU. IMUs provide data about an object’s motion and orientation, and by combining data from two different IMUs, the aim was to achieve more accurate and reliable motion tracking for the vehicle. To facilitate this fusion, I utilized the robot_localization ROS package, a tool designed for state estimation. However, I encountered challenges: the sensor fusion output seemed unresponsive to vehicle movement. This indicated potential areas of improvement, possibly in covariance (a measure of the data’s uncertainty), frame_id (the reference frame for the data), or timing (synchronization of data streams). Further insights into this challenge can be found here.
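To make the role of covariance concrete, here is a toy, one-dimensional illustration of the weighting principle behind filters like the one in robot_localization; the real package runs a full extended Kalman filter over many state variables, so this is a simplification for intuition only. The key point, and why a badly set covariance can make the output unresponsive, is that each measurement is weighted by the inverse of its variance: a sensor reported with huge covariance is effectively ignored.

```python
def fuse(z1, var1, z2, var2):
    """Fuse two scalar measurements by inverse-variance weighting.

    This is the one-dimensional core of Kalman-style fusion: the measurement
    with the smaller variance (higher stated confidence) pulls the estimate
    harder, and the fused variance is smaller than either input's.
    """
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var
```

If one IMU publishes with an enormous covariance (or a covariance of zero is misread as “perfect”), the fused estimate sticks to the other source, which matches the unresponsive behavior described above as a plausible failure mode.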

Conclusion

Looking back on my internship at 99P, it was a period of intensive exploration, learning, and concrete accomplishments. I successfully integrated multiple sensors, enhancing the capabilities of the SOMEthings Project vehicles. Each task, from understanding the intricacies of ROS to diving deep into the world of sensors like the Intel Realsense D435i Depth Camera, the HC-SR04 Ultrasonic Sensor, and the DW1000 UWB Module, provided me with invaluable insights and hands-on experience. The challenges I faced, especially in areas like indoor positioning and sensor fusion, not only tested my problem-solving skills but also highlighted the complexities of developing autonomous systems. These challenges, while demanding, have paved the way for future improvements and innovations. As I conclude this chapter of my professional journey, I’m confident that the foundation laid during this internship will significantly influence the future trajectory of the SOMEthings Project and my own career.

We appreciate your interest in 99P Labs and the time you’ve taken to read our blog. If you have any queries, feedback, or are interested in potential collaborations, we invite you to get in touch with us. You can stay updated with our latest research and innovations by connecting with us on LinkedIn or Medium. Alternatively, you can reach out to us directly at research@99plabs.com. We’re always eager to engage in enriching discussions and explore new opportunities.

Thank you for your support. We look forward to connecting with you soon.
