A bus that speaks sign language? Meet Olli.

How AI and IoT can help build autonomous transportation for all

Imagine: a self-driving shuttle bus visually recognizes disabled passengers as it picks them up, activating a ramp so they can board. It guides visually impaired passengers to empty seats. It understands and speaks sign language, and even reminds people if they leave a bag under a seat.

All this will soon be possible with Accessible Olli, an AI-enabled, electric shuttle bus created jointly by Local Motors, IBM, and the CTA Foundation, and enhanced with the newest technologies from more than 15 leading technology partners — from navigation systems and AI to additive manufacturing and connected devices. Olli is being showcased at CES 2018 and will soon head to Washington, DC for additional exposure and development.

Gina O’Connell from Local Motors said that her company is looking at accessibility from four different perspectives — visual, audio, cognitive, and mobility — to address the needs of an estimated 15 percent of the world’s population who experience some form of disability, including many who never leave home because of transportation difficulties.[1]

So how does Olli communicate with the visually impaired or hard of hearing?

Olli safely guides visually impaired passengers to empty seats through audio cues and an array of haptic sensors that enable them to feel vibrations on their hands when they arrive at an open seat, according to Drew LaHart, the program director for IBM’s accessibility division.[2]

For the hearing impaired, LaHart said that Olli will recognize sign language using machine learning and image recognition capabilities — and even respond via a hologram of a person using sign language.

The Angle:

To power Olli’s accessibility, artificial intelligence and IoT technology form a one-two punch to collect and analyze mountains of interaction and driving data in milliseconds.

Sachin Lulla, global vice president for automotive strategy and solutions leader at IBM, said that Olli’s AI-powered riding experience would help “build the world’s most accessible vehicle” for people with disabilities.[3]

Besides accessibility, AI technology also helps Olli continually learn the best routes around a city and make recommendations to passengers, such as suggesting restaurants or reminding them to bring an umbrella when rain is forecast for later in the day.

“It’s about mass personalization,” Lulla said. “We have to cater to everyone’s personal preferences.”

And Olli can be built quickly. Lulla said shuttles are 3-D printed in 10 hours, and may soon take just three hours to print. Olli’s design can easily be configured for taxi companies, corporate campuses, or other uses.

“Consumers want things now, or at least in a matter of days. It’s no longer about five-year vehicle cycles,” Lulla said.

Want to learn how manufacturers can rapidly design and build products? We created a Model Factory simulator to show AI and data solutions in action.