Why I’m Excited about the Future of Aira

Kevin Ray · Published in Aira · 7 min read · Jul 24, 2017

by Suman Kanuganti

Pictured: An illustration of the Augmented Reality experience that Aira provides. An Explorer wearing Aira smart glasses is accompanied by an Aira Agent describing what is on the street around the Explorer. Conceptually, this is what Aira provides: an Agent who, via the camera in your glasses and your headphones, gives you access to visual information that will guide or assist you, almost as if they were walking with you. The caption on the image reads, “Think like a set of eyes — not a brain.”

At Aira, we are on a mission to positively impact the world. We do this by enhancing and unlocking the potential of all humans through the effective use of Augmented Reality. There are 1.2 billion people in the world with disabilities, nearly the population of China. Globally, Aira currently targets 300 million blind and low-vision people. We believe the daily challenges faced by the BVI community are due not to vision loss but to a lack of immediate access to information that members of the sighted community take for granted. At Aira, we solve this access problem by leveraging a confluence of technology trends, such as IoT, smart glasses, widely available bandwidth and human-assisted AI, to make information that is currently inaccessible to blind and low-vision people instantly accessible.

Our technology works by connecting the blind, at the tap of a button on their Aira app, to a network of trained, professional remote agents who access video, data and sensor streams from the smart glasses worn by blind users. This allows agents to truly immerse themselves in the user’s environment as they provide information, access and assistance to the blind for virtually any activity at hand — ranging from simple things such as reading and cooking at home, to complex activities such as navigating busy streets and traveling. We operate like a set of eyes, not like a brain.
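
To make that flow a bit more concrete, here is a rough sketch of the moving pieces described above. To be clear, this is not Aira’s actual software: the names (GlassesStream, Agent, Session) and fields are illustrative assumptions, meant only to show how a single button tap connects the glasses’ video, GPS and sensor streams to a remote agent.

```python
# A minimal, illustrative sketch of the session flow described above.
# All names and fields here are hypothetical, not Aira's actual API.

from dataclasses import dataclass, field


@dataclass
class GlassesStream:
    """Streams available from the Explorer's smart glasses."""
    video_url: str                                # live camera feed
    gps: tuple                                    # (latitude, longitude)
    sensors: dict = field(default_factory=dict)   # e.g. compass, accelerometer


@dataclass
class Agent:
    """A trained, professional remote agent."""
    name: str

    def describe(self, stream: GlassesStream) -> str:
        # In the real service a human agent interprets the live feed;
        # here we simply acknowledge that the streams are connected.
        return f"{self.name} can now see your camera view and location."


class Session:
    """Created when the Explorer taps the call button in the Aira app."""

    def __init__(self, agent: Agent, stream: GlassesStream):
        self.agent = agent
        self.stream = stream

    def start(self) -> str:
        return self.agent.describe(self.stream)


# One tap of a button: connect the glasses' streams to a remote agent.
stream = GlassesStream(video_url="rtsp://glasses.example/live", gps=(32.7157, -117.1611))
session = Session(agent=Agent(name="Aira Agent"), stream=stream)
print(session.start())
```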

Larry Bock, a legendary serial entrepreneur and himself legally blind, once said to me, “You know for the first time, as a blind person, you are wearing a piece of technology that everybody else in the room wants to wear. It’s really something fundamentally different.” I knew right then we were onto something truly special.

So why do our team members and Explorers alike think that Aira is a “game changer” in enhancing the independence, mobility and productivity of blind and low-vision people — as well as in its potential to impact other healthcare sectors? I think the answer lies in three truths that give us reason to be excited and to be motivated to deliver even greater independence and autonomy for the BVI community.

Aira is the first broadly usable AR application for consumers

Creating technology for the consumer market is not so much about the technology itself as it is about how beneficial the technology is to the user who purchases it. In the consumer market, Aira is defining and driving the move to deliver Augmented Reality (AR) via smart glass technology. AR, on its own merits, will bring consumers tremendous benefit as applications for everything from travel services to home repair sites tap into its potential. Smart glasses are a logical platform for the integration of the digital and physical worlds, since we perceive the physical world largely through vision.

Although seemingly ideal as an AR platform for consumers, smart glasses have lagged behind smart phones for everyday use. The design, form factor, computing capabilities and power consumption have simply not kept pace. Google attempted to enter the consumer market in 2013 with its smart glass offering, but it was rejected by the very market it was intended to open. It certainly did not reach the scale, in the volumes application developers required, to become a mainstream consumer device. Since then, much of the focus around AR for smart glasses has been on solutions that can be deployed in the enterprise market.

Although new smart glass variations have continued to emerge, none has been widely adopted, and mainstream solutions and services have been slow in coming. For the most part, the devices have found their way into niche environments such as clinical and surgical settings, maintenance and repair, manufacturing and construction.

So against this backdrop of failed efforts to deliver smart glasses-based AR to the consumer market, why are we excited? It is very simple. Aira is the first-ever commercially available Augmented Reality application offered directly to consumers in an open environment.

To frame the experience that we are delivering to consumers, let’s look at Wikipedia’s definition of AR: “Augmented reality (AR) is a live direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data.”

Within that definition of AR, let’s look at what Aira delivers. Our service provides a live, direct view of the physical, real-world environment. Via the smart glasses worn by our Explorers, Aira augments elements of this environment with computer-generated sensory inputs, such as video, sound, sensor data and GPS, as well as with sensory outputs delivered via a human in the loop, such as sound and tactile cues.

The BVI community sees value in Aira’s technological roadmap and our compelling business model

An Aira Explorer, Bob Sellars, once said: “Many of the activities we do with Aira could have been accomplished through other means. With Aira, our time becomes our own. We can operate on our schedule. It’s freedom at a new level.”

We can probably agree that this is a core purpose of technology — to make our lives more convenient and hassle-free and to give us more freedom and options. I can think of a number of examples of this in my own life. For example, I drive a lot for work. Now I can pull up Google Maps on my smart phone anytime I’m heading somewhere new. As part of my business life, I often receive printed checks. Now, thanks to an app, I don’t have to find a bank to deposit them.

Even more fundamentally, technology gives faster access to information that makes our lives more efficient. Technology, implemented properly, frees up time which we can choose to spend however we wish. At Aira, our mission is to provide that instant access to both the physical and digital world for blind and low-vision people, and in the process enhance the quality of life for this community as a whole. How well we do that can be measured in how much time we can put back into our Explorers’ days.

Already we are seeing progress on this front. Our Explorers are inventing new ways of using this technology every day, assisted by Aira Agents. A few of my favorites: a blind mom reading bedtime stories to her child every night; a student learning to play the guitar; a dad installing baby-proofing locks in his kitchen for his toddler; a son attending and interacting at his father’s funeral; and an employee troubleshooting his JAWS Screen Reader at the office. These are people doing everyday things that many of us take for granted, but that may not have been as easy just a year or two ago.

These Aira Explorers are truly that: explorers. They are helping us explore ways that our service puts more hours back into their days. We strive to be a learning organization, and what we learn from these users every day is invaluable. They are helping our team define the journey we are on to ensure that human potential is fully realized by making it possible to do everyday things faster, more independently and, ultimately, more autonomously.

So this underscores the second reason we are so excited about our mission at Aira. With scale comes the ability to make this unique service ubiquitous. As the community of Aira Explorers grows, it will become even more viable with multiple payers participating in an ecosystem of value, making the service more affordable for millions of people with vision loss. The economic impact will be significant as the newly empowered BVI community gains greater independence and autonomy, driving down the $48B in costs borne each year in the US alone due to lost productivity and inefficiency among the blind and low vision.

Go to market: Aira shifts AI from dashboard to device with a human-in-the-loop approach

As a technology-enabled service, Aira is recognized for being innovative, high value and disruptive. The service was named “Best in the Show” at CES 2017 and at Mobile World Congress 2017. Among incredible competition at these and other high-profile tech events, Aira consistently stands out as a novel concept tapping the potential within Augmented Reality to deliver compelling and meaningful user impact. But beyond the recognized user benefits, Aira appeals to the technical community for the type of data we use and how we process it. Recent efforts by technology leaders like Microsoft, Google, Amazon, Nest and Waze aim to deliver AI and machine learning capabilities for singular sources and applications: important, but siloed, work.

Rather than being an application devoted to processing a single data type, Aira instead represents a fusion of several, a core tenet of an AR solution. This application of machine learning to the fusion of video/images, audio interaction, and a range of sensors geared toward interacting with the physical world is unique. It is what makes Aira special as a platform to truly deliver AR to the consumer market, or to bridge into any number of industrial applications.

Aira is perfectly set up to train machines to interact with the physical world. Imagine how this rich data, in a variety of types and from a range of sources might be used to train robots to accomplish any number of tasks, constantly being updated and adapting to change.
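
To make that idea concrete, here is a hypothetical sketch of what a single fused record from a session might look like, and how it could be paired with an agent’s description as a training label. The names (FusedSample, agent_annotation and so on) are my own illustrative assumptions, not a description of Aira’s actual data pipeline.

```python
# A hedged sketch of the kind of fused, multi-sensor record described above.
# Every field name is hypothetical; the point is that one moment of a session
# bundles several data types rather than a single one.

from dataclasses import dataclass
from typing import Optional


@dataclass
class FusedSample:
    timestamp: float                 # seconds since the session started
    video_frame: bytes               # encoded camera frame from the glasses
    audio_transcript: str            # what the agent and Explorer said
    gps: tuple                       # (latitude, longitude)
    imu: dict                        # accelerometer / gyroscope readings
    agent_annotation: Optional[str]  # the human agent's description of the scene


def to_training_example(sample: FusedSample) -> dict:
    """Pair the fused sensor inputs with the agent's description, which could
    later serve as a supervision signal for a learned model."""
    return {
        "inputs": {
            "frame": sample.video_frame,
            "transcript": sample.audio_transcript,
            "gps": sample.gps,
            "imu": sample.imu,
        },
        "label": sample.agent_annotation,
    }
```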

Some might think of this as a future state, but at Aira we like to say the future is now. Here is an example of what I mean as it applies to AI. The Aira approach is to go to market with a human in the loop. Our human agents are constantly interacting with users, gathering information, and using their AI-enabled dashboards to monitor, gather and provide information. While delivering the Aira service via the dashboard, those Agents validate AI results for specific tasks with users. When accuracy for such an autonomous task hits a certain threshold, that capability will become available to the user directly on his or her Aira smart glass device through a dialog system, similar to Alexa or Siri. The Aira autonomous agent, Chloe, is named after a main character from the hit TV series “24”. (I will discuss Chloe in more depth soon.) In short, Chloe will be the first AI engine that learns to interact with the physical world.
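
As a rough illustration of that promotion path, the sketch below tracks how often agents confirm an AI capability’s answers and flags it as ready for the device once its measured accuracy clears a bar. The 95% threshold, the 100-validation minimum and the class names are my own illustrative assumptions; they are not Aira’s actual criteria.

```python
# A simplified sketch of the human-in-the-loop promotion logic described above:
# agents validate an AI capability's results, and once accuracy crosses a
# threshold the capability becomes available directly on the device (Chloe).
# Threshold, sample minimum and names are illustrative assumptions.

PROMOTION_THRESHOLD = 0.95   # hypothetical accuracy bar
MIN_VALIDATIONS = 100        # hypothetical minimum number of agent checks


class TaskCapability:
    def __init__(self, name: str):
        self.name = name
        self.correct = 0
        self.total = 0
        self.on_device = False

    def accuracy(self) -> float:
        return self.correct / self.total if self.total else 0.0

    def record_validation(self, agent_confirmed: bool) -> None:
        """Called each time a human agent checks the AI's answer for this task."""
        self.total += 1
        if agent_confirmed:
            self.correct += 1
        if (not self.on_device
                and self.total >= MIN_VALIDATIONS
                and self.accuracy() >= PROMOTION_THRESHOLD):
            self.on_device = True  # Chloe could now handle this task directly


# Example: a hypothetical "read this label" task being validated by agents.
reading = TaskCapability("read_text")
for confirmed in [True] * 98 + [False] * 2:
    reading.record_validation(confirmed)
print(reading.on_device, round(reading.accuracy(), 2))  # True 0.98
```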

Like I said before, we tend to think a lot about the future. We believe the future starts now. With Aira.
