Navigation for Visually Impaired

Introduction

Smart Cane 2.0 is a smart navigation device that helps the visually impaired navigate with minimal external help. Apart from giving navigation instructions, it warns users when they are heading the wrong way and gives tactile cues for course correction, lets them save new routes while navigating, and lets them tag landmarks such as footpaths, traffic signals, barriers, and subways along the way.

The existing Smart Cane

Smart Cane 1.0 enabled the visually impaired to detect obstacles in their path, but navigating along a route remained an unsolved problem.

If the complete route and additional attributes (such as footpaths and subways) were given to them in an accessible format (audio, vibrations, etc.), navigation would become much easier.

This was our motivation to work on NaVI.

Vision

“To build a product which helps the visually impaired navigate using minimum help and so, to make them independent.”

Open House demo

Our Journey

After picking up COP 315 as a course, we faced a huge dilemma while choosing the project. We wanted something where we would have to start from scratch and develop the prototype with our own ideas. So, after searching through many lists, we finally took up Smart Cane 2.0 with the aim of 'Developing a Smart Navigation System for the visually impaired'. We started with our first target: building a stand-alone device, NaVI (Navigation for the Visually Impaired). We chose the Raspberry Pi 3 as the processor, bought a GPS module, and got started by setting them up. We chose Python for faster development, an 8 GB SD card for more storage space, and a 2500 mAh power bank for adequate battery backup. After many brainstorming sessions, we decided on NaVI's functionality. We wanted it to be a smart device that would not only give navigation directions at turns and similar points, but also warn users when they are heading the wrong way, let them save a new route while navigating, and tag landmarks like footpaths and traffic signals on the way. First, we implemented the algorithm that takes a route file and speaks out navigation instructions at turns.
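The turn-by-turn logic can be sketched roughly like this: compare each GPS fix against the waypoints in the route file and speak the stored instruction once the user comes within a few metres of a turn. This is an illustrative reconstruction in Python, not the actual NaVI code; the haversine formula, the 10 m trigger radius, and the sample coordinates are all our assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    R = 6371000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def next_instruction(fix, route, radius_m=10.0):
    """Return the instruction of the first waypoint within radius_m of the
    current fix, or None if no waypoint is close enough yet."""
    for lat, lon, instruction in route:
        if haversine_m(fix[0], fix[1], lat, lon) <= radius_m:
            return instruction
    return None

# Hypothetical route file contents: (lat, lon, spoken instruction)
route = [
    (28.5450, 77.1926, "Turn left towards the library"),
    (28.5455, 77.1930, "Turn right at the footpath"),
]
```

In the real device the returned string would be handed to the TTS engine rather than printed.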

We discussed our weekly progress and plans with our mentor Mr. Piyush Chanana, Project Associate at Assistech Labs, IIT Delhi. His ideas and inputs helped us make considerable progress in this project and helped us brainstorm further about the additional functionalities we can add to our navigation device.

Now the stage was set. Soon, we divided our tasks and started working meticulously to deliver what could be the most helpful technology for the visually impaired.

One of the main features of NaVI is that users can add new routes themselves. This gives the user the freedom to create a route file for a new path while walking it and to reuse it on later visits. All they have to do is press the corresponding buttons at the turns. Moreover, they can also tag additional landmarks along the way.
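A minimal sketch of how such route recording could work: each button press appends the current GPS fix, tagged as a turn or a landmark, and the list is written out as the route file at the end. The class and field names below are hypothetical, not the actual NaVI implementation.

```python
import json

class RouteRecorder:
    """Builds a route file as the user walks, one entry per button press."""

    def __init__(self):
        self.points = []

    def mark(self, lat, lon, kind="turn", note=""):
        # kind distinguishes turns from landmarks; note carries detail
        # such as "traffic signal" or "footpath".
        self.points.append({"lat": lat, "lon": lon, "kind": kind, "note": note})

    def save(self, path):
        # Persist the recorded route for later navigation runs.
        with open(path, "w") as f:
            json.dump(self.points, f, indent=2)
```

On the device, `mark` would be wired to the keypad handler and `save` called when the user finishes the route.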

On testing this program on some IIT routes whose route files had already been created, we found that it spoke the navigation instructions around turns and other points of interest, but there were two main drawbacks: the robotic output voice from eSpeak and the low accuracy of the GPS. Some of us took care of these issues while the others continued development. Keeping in mind that our target user is visually impaired, the next target we achieved was deviation handling and course correction. We implemented algorithms that check for perpendicular and angular deviations and alert the user through vibrations and beeps. The alerts continue until the user corrects their course.
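For illustration, the two deviation checks could look something like this: a cross-track (perpendicular) distance from the current route segment, computed with a flat-earth approximation that is adequate over the short distances between consecutive waypoints, and a signed heading error for the angular deviation. This is our sketch of the idea, not the project's actual algorithm.

```python
import math

def cross_track_m(fix, seg_start, seg_end):
    """Perpendicular distance (metres) from a GPS fix to the route segment,
    using a local equirectangular (flat-earth) approximation."""
    R = 6371000.0
    lat0 = math.radians(seg_start[0])

    def to_xy(p):
        # Project (lat, lon) to metres east/north of seg_start.
        return (R * math.radians(p[1] - seg_start[1]) * math.cos(lat0),
                R * math.radians(p[0] - seg_start[0]))

    px, py = to_xy(fix)
    ex, ey = to_xy(seg_end)
    seg_len2 = ex * ex + ey * ey
    # Clamp the projection so points beyond the segment ends measure
    # distance to the nearest endpoint.
    t = max(0.0, min(1.0, (px * ex + py * ey) / seg_len2)) if seg_len2 else 0.0
    return math.hypot(px - t * ex, py - t * ey)

def heading_error_deg(user_heading, path_heading):
    """Signed angular deviation in [-180, 180): positive means the user
    is veering right (clockwise) of the path heading."""
    return (user_heading - path_heading + 180.0) % 360.0 - 180.0
```

Thresholds on these two values would decide when to fire the vibration and beep alerts, and the sign of the heading error would pick the left or right cue.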

Coming back to the robotic sound of eSpeak: we found its solution in Pico TTS, another open-source TTS engine with better sound quality and multiple-language support. The GPS accuracy was improved by increasing the refresh rate and enabling the satellite-based augmentation system (SBAS). The results this time were considerably better. Although it was 4 am when we got them, the feeling was indescribable.

Having accomplished the navigation task, we moved on to designing a user-friendly interface for the visually impaired. We used a 4x3 keypad and assigned different functions to its keys, with the same button performing different actions in different situations. Finally, after rigorous testing and debugging, we had a user-friendly and robust navigation interface.
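The mode-dependent keypad behaviour can be modelled as a simple lookup from (mode, key) to action. The mode names and actions below are purely illustrative; the real NaVI mapping may differ.

```python
# Hypothetical mode-dependent keypad mapping: the same key triggers a
# different action depending on whether the device is idle, navigating,
# or recording a route.
ACTIONS = {
    "idle":       {"1": "start_navigation", "2": "start_recording", "#": "announce_battery"},
    "navigating": {"1": "repeat_instruction", "2": "report_distance", "#": "stop_navigation"},
    "recording":  {"1": "mark_turn", "2": "mark_landmark", "#": "save_route"},
}

def handle_key(mode, key):
    """Resolve a key press to an action for the current mode (None if unmapped)."""
    return ACTIONS.get(mode, {}).get(key)
```

Keeping the mapping in one table makes it easy to audit which key does what in each mode, which matters for an eyes-free interface.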

All this was done on breadboard circuits, which were quite bulky and large. Our next goal was to reduce the size and weight of the circuit. We designed a PCB for switching the Raspberry Pi on and off and for controlling the vibration motors and buzzer. We even desoldered the LAN and USB ports of the Raspberry Pi, which was a very tough decision keeping in mind the cost of the Pi :P. The circuit was then enclosed in a compact 3D-printed case, and NaVI was ready to rock.

Each of us knew we had delivered some of the most helpful navigation technology for the visually impaired to date, and our concerted efforts had helped us deliver it in a short span of three months.

Final Open House Presentation

Components Used

  • Raspberry Pi 3
  • GPS module (NEO-6M)
  • Power bank (2500 mAh)
  • Bluetooth headset (BH702)
  • Keypad (4x3)
  • Vibration motors
  • Buzzer
  • Arduino (for configuring the GPS module)

Technologies Used

  • 3D printing
  • PCB fabrication through lithography

Once done, everything was integrated and made as compact and lightweight as possible.

Feedback and Suggestions

We tested our device with some visually impaired people and gathered their valuable feedback. They were very happy to use the device and appreciated our initiative and efforts. We incorporated some of the changes they suggested, such as:

  • The vibrations for slight-left and slight-right deviations were not distinguishable, so we changed the vibration pulse-width ratios to make them distinguishable.
  • The offline TTS engine Pico offered British and American accents, which were not easily understandable for them. Solving this problem is one of our future goals.
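For illustration, one way to make the two cues distinguishable is to give each a different on/off pulse pattern, e.g. short 1:1 pulses for slight-left versus a long 4:1 pulse for slight-right. The exact ratios here are hypothetical, not the values used in NaVI.

```python
def pulse_train(pattern, cycles):
    """Expand an (on_ms, off_ms) pattern into a flat on/off schedule in ms,
    ready to drive a vibration motor via GPIO."""
    on_ms, off_ms = pattern
    return [on_ms, off_ms] * cycles

# Hypothetical ratios: the left cue uses short 1:1 pulses, the right cue
# a long 4:1 pulse, so the two are easy to tell apart by feel alone.
PATTERNS = {"slight_left": (100, 100), "slight_right": (400, 100)}
```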

Finally, we got the opportunity to demonstrate our device at Open House 2017 to groups of visually impaired people, middle-aged as well as children, along with tech enthusiasts and professors. The overall feedback comprised appreciation and some suggestions, and was very satisfying for us. Some of the valuable suggestions we received:

  • The TTS accent needs to be customised for Indian users.
  • The roughly 30-second wait at startup for the GPS values to stabilise was slightly irritating for some visitors.
  • It was also suggested that we add the ability to stop navigation midway and return to the starting point without switching off the device.

Future Scope of Development

We are really looking forward to pursuing the project further, adding more accessibility features and optimising the device. We would be more than willing to incorporate new features suggested by readers of this blog. As of now, we will mainly be focusing on:

  • Integrating NaVI and the obstacle detector to make Smart Cane 2.0
  • Reducing the energy consumption of the device
  • Further increasing the GPS accuracy
  • Customising text-to-speech for Indian languages and accents
