A New Perspective on Road Safety

Redefining the mainstream bike helmet

With the advent of autonomous technology and automatic safety measures in automobiles, there’s no doubt that safety on the road is steadily improving. Hardware and software advance day by day, making the driving experience safer and less worrisome.

With this in mind, Blake Karwoski, Juliang Li, Sam Grayson and I decided to build a safety system for unpowered modes of transportation, such as bikes, scooters, and skateboards, in the form of a sensor-enabled IoT helmet. At HackRice, an annual 36-hour hackathon at Rice University, we successfully built a working model, appropriately named AHEAD.

Here’s how we did it:

Luckily, we thought of the idea a few days before the hackathon, which gave us enough time to plan out the project and order the necessary hardware. We had a few features in mind, such as turn signals, brake lights, and rear signaling, so we ordered parts accordingly and left the rest for the hackathon itself!

Hardware Implementation:

The original plan was to host all of our electronics (sensors, lights, etc.) on the Raspberry Pi Zero W, a Bluetooth- and WiFi-enabled microcomputer. After a few hours at the event, however, we realized that the board didn’t support some of our sensors’ protocols. Luckily, we had an old Arduino Uno to spare, so we decided to host some of our sensors, along with all of our LEDs, on it. By the end of the event, our Raspberry Pi was receiving and filtering gyroscope and accelerometer data and communicating with our Arduino over serial. Below is a basic schematic of our hardware setup, along with a few of the protocols we used for communication:

Hardware Schematic

To communicate between processors and sensors, we used I2C and analog polling, and to communicate between the Arduino and the Raspberry Pi, we took advantage of serial communication over USB. Further, we wanted reliable, off-the-shelf power regulation, so we used a standard 5V 4.82A power bank (not shown in the diagram).
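
The serial link itself is simple enough to sketch. On the Pi side, something like the following (using pyserial) captures the idea; the port name, baud rate, and message format here are illustrative assumptions rather than our exact protocol:

```python
# Minimal sketch of the Pi-side serial link. Assumed protocol: one
# newline-terminated command string per message, e.g. "LED:BRAKE_ON".
# The port name and baud rate are assumptions, not our exact values.
import serial

ser = serial.Serial("/dev/ttyACM0", 115200, timeout=1)

def send_command(command: str) -> None:
    """Send a newline-terminated command to the Arduino."""
    ser.write((command + "\n").encode("ascii"))

def read_reply() -> str:
    """Block (up to the timeout) for a single reply line."""
    return ser.readline().decode("ascii").strip()

send_command("LED:BRAKE_ON")
print(read_reply())
```

Newline-delimited text commands are easy to debug with a serial monitor, which mattered a lot more than bandwidth at a 36-hour event.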

Sensor Overview:

As discussed above, we used the following sensors and components in the helmet:

  • LIDAR optical distance sensor
  • Ambient light sensor
  • 6-DOF inertial measurement unit (IMU)
  • Addressable LED lights

The IMU, which is essentially a breakout board with an onboard processor, accelerometer, and gyroscope, was hosted on the Raspberry Pi. The rest of our sensors and electronics were controlled by the Arduino. The reasoning was that all of the heavy computation for filtering data would happen on the Raspberry Pi (a microcomputer with its own memory and operating system), while simpler polling and signaling would happen on the Arduino. The photo further below shows the placement of electronics for our first iteration.
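
To make that split concrete, here’s roughly what polling the IMU over I2C looks like from the Pi. The addresses below assume an MPU-6050-style register map, which is common for 6-DOF breakouts; our actual part may differ, so treat them as illustrative:

```python
# Sketch of polling a 6-DOF IMU over I2C from the Pi, assuming an
# MPU-6050-style register map (the actual breakout may differ).
from smbus2 import SMBus

IMU_ADDR = 0x68        # typical MPU-6050 I2C address (assumption)
PWR_MGMT_1 = 0x6B      # power management register
ACCEL_XOUT_H = 0x3B    # first of six accelerometer data registers

def read_word(bus: SMBus, reg: int) -> int:
    """Read a signed 16-bit big-endian value from a register pair."""
    hi = bus.read_byte_data(IMU_ADDR, reg)
    lo = bus.read_byte_data(IMU_ADDR, reg + 1)
    val = (hi << 8) | lo
    return val - 65536 if val & 0x8000 else val

with SMBus(1) as bus:
    bus.write_byte_data(IMU_ADDR, PWR_MGMT_1, 0)   # wake the IMU from sleep
    ax = read_word(bus, ACCEL_XOUT_H)       # raw accelerometer X
    ay = read_word(bus, ACCEL_XOUT_H + 2)   # raw accelerometer Y
    az = read_word(bus, ACCEL_XOUT_H + 4)   # raw accelerometer Z
    print(ax, ay, az)
```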

Data Collection & Sampling

From a software standpoint, we planned to remotely log in to the Raspberry Pi and pull from a git repository as needed. This way, we were able to work on the backend/polling code on our local machines before testing it on the microcomputer, and those working on hardware could focus, without interruption, for extended periods of time.

As mentioned earlier, the Arduino hosted several sensors as well as the LEDs, so logic that turned on lights based on those sensors was easy to implement on the Arduino itself. The difficult part was communicating between the Raspberry Pi and the Arduino to relay IMU data.

We quickly realized the importance of synchronizing access to the LEDs from both the Arduino (internally) and the Raspberry Pi. Our solution was to make every access request to the LEDs an ‘action’. We then built a dedicated handler thread to process these actions on a first-in, first-out (FIFO) basis, which ensured that the LEDs were only ever accessed atomically. With this architecture in place, when our sensor-polling thread needed to set an LED animation, it only had to grab the LED handler and queue up an action; the action would then be executed asynchronously.
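
Stripped of the hardware details, the pattern looks like this in Python (the names here are illustrative, not our exact code):

```python
# Stripped-down sketch of the LED action handler: a worker thread drains
# a FIFO queue, so only one 'action' ever touches the LEDs at a time.
import queue
import threading

class LedHandler:
    def __init__(self):
        self._actions = queue.Queue()  # FIFO: actions run in arrival order
        self._worker = threading.Thread(target=self._run, daemon=True)
        self._worker.start()

    def submit(self, action):
        """Queue a zero-argument callable that touches the LEDs."""
        self._actions.put(action)

    def _run(self):
        while True:
            action = self._actions.get()  # blocks until an action arrives
            action()                      # only this thread touches the LEDs
            self._actions.task_done()

# A sensor-polling thread just queues work and moves on:
leds = LedHandler()
leds.submit(lambda: print("brake lights on"))  # stand-in for a real LED call
```

Because only the worker thread ever touches the LEDs, callers never block on slow LED writes, and no two animations can interleave mid-update.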

Features

By the end of the event, we were proud to have implemented several features that made ‘AHEAD’ stand out.

Ambient Light Triggered Headlight

We used an ambient light sensor to activate the headlight in darker environments (tunnels, nighttime, etc.).
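
The logic is a simple threshold with a bit of hysteresis, so the light doesn’t flicker at dusk. It runs against the ambient light sensor on the Arduino; the sketch below shows the idea in Python, and the cutoff values are illustrative rather than calibrated readings:

```python
# Headlight thresholding with hysteresis (cutoff values are assumptions).
DARK_ON = 200    # turn the headlight on below this reading
DARK_OFF = 300   # turn it back off above this one; the gap avoids flicker

headlight_on = False

def update_headlight(light_level: int) -> bool:
    """Return the new headlight state for the latest sensor reading."""
    global headlight_on
    if not headlight_on and light_level < DARK_ON:
        headlight_on = True
    elif headlight_on and light_level > DARK_OFF:
        headlight_on = False
    return headlight_on
```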

Automatic Brake Lights

Much like a car’s brake lights, AHEAD automatically activates its rear brake lights when the rider is slowing down. By using the onboard IMU to detect deceleration, the helmet can warn nearby vehicles and pedestrians that the rider is braking.
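
In sketch form, the idea is to low-pass filter the forward acceleration and latch the brake light while deceleration exceeds a threshold. The filter constant and threshold below are assumptions, not tuned values:

```python
# Brake-light sketch: smooth the forward acceleration, then threshold it.
BRAKE_THRESHOLD = -1.5  # m/s^2 along the direction of travel (assumed)
ALPHA = 0.2             # smoothing factor for the low-pass filter

filtered_accel = 0.0

def update_brake_light(raw_forward_accel: float) -> bool:
    """Return True while the rider appears to be braking."""
    global filtered_accel
    # Exponential moving average tames accelerometer noise from road bumps.
    filtered_accel = ALPHA * raw_forward_accel + (1 - ALPHA) * filtered_accel
    return filtered_accel < BRAKE_THRESHOLD
```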

Rear Proximity Signal

Our system uses a LIDAR range sensor to detect nearby objects. When a car or person approaches from behind the helmet, a siren is triggered to warn both the rider and the approaching vehicle.
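
A rough sketch of the trigger logic: require the object to hold inside the danger radius for a few consecutive readings, so a single noisy sample doesn’t set off the siren. Both values below are illustrative:

```python
# Proximity siren sketch (threshold values are assumptions).
DANGER_CM = 150      # distance at which an object counts as "close"
CONFIRM_SAMPLES = 3  # consecutive close readings needed, to filter noise

close_count = 0

def update_siren(distance_cm: float) -> bool:
    """Return True once something has stayed close for enough samples."""
    global close_count
    close_count = close_count + 1 if distance_cm < DANGER_CM else 0
    return close_count >= CONFIRM_SAMPLES
```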

Gesture-Based Turn Signal

Using the helmet’s onboard IMU, AHEAD detects left and right gestures, which activate the corresponding turn signals.
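
The core of the gesture detector can be sketched as a spike threshold on the gyroscope’s yaw rate: a deliberate head flick is much sharper than normal steering. The threshold and the sign convention below are assumptions (they depend on how the IMU is mounted):

```python
# Gesture sketch: a sharp yaw-rate spike toggles the matching turn signal.
from typing import Optional

YAW_SPIKE = 120.0  # deg/s; assumed cutoff for a deliberate head flick

def detect_gesture(yaw_rate_dps: float) -> Optional[str]:
    """Map a gyro reading to 'left', 'right', or None."""
    if yaw_rate_dps > YAW_SPIKE:      # sign convention depends on mounting
        return "left"
    if yaw_rate_dps < -YAW_SPIKE:
        return "right"
    return None
```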

Ending Remarks:

Overall, we had an awesome time working on this project, and we’re proud of what we were able to accomplish given the time constraints! Moving forward, we’re entering the product into a few competitions, and we’ll continue to update this article as it develops. For any other questions about AHEAD, feel free to reach out!