Understanding Reality: Part 2
The AR world beyond computer vision
This is Part 2 of a two-part series examining how Augmented Reality recognizes and interprets the world around you. You can find Part 1 here.
As humans, we perceive the world through five senses, and combining them gives us a depth of activities we can perform and enjoy. So when we design AR applications, why do we so often limit their capabilities to just one? While the visual data I explored in Part 1 is both the most common and one of the most useful, it is not the only way an AR application can understand the world. With the expansive world of the Internet of Things, or IoT, sensor data of all kinds can be integrated into experiences to take them well beyond the visual. Let’s take a look at some common ways AR is leveraging IoT today to create new and better experiences.
GPS
Some experiences are designed to work in specific locations, or to orient themselves to help you find your way around. In order to know where you are, an application could try to detect unique landmarks or specific points, or it could simply leverage satellites. With AR becoming more and more capable on smartphones today, it is no wonder GPS has become a commonly used way to orient AR applications and enhance their ability to provide content that fits your environment. Not only can an application understand where someone is and provide information about the area or how to navigate it, it can also re-introduce scarcity to digital content by anchoring the experience to a specific location. We can even build a live community experience by allowing users to view the same content together when they gather in a common space. All three of these can have a pretty big impact on what an application can do and, in turn, unlock new value for AR.
Location-Specific Context
Understanding where a user is physically located can place your application in the real world. This enables an application to react more effectively to visual data in the area by recognizing buildings, streets, and so on. One of many uses for this is navigation: blending GPS location with visual recognition of key landmarks gives a more seamless understanding of how to get from one location to another. Google has done a wonderful job of this with Google Maps AR.
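To make that concrete, here is a minimal sketch of one building block of GPS-based navigation: computing the compass bearing from the user’s position to a known landmark, which an app could use to place a directional arrow or label. The coordinates are made up, and this is illustrative only, not how Google Maps AR is actually implemented.

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial compass bearing in degrees from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

# Hypothetical user position and landmark (latitude, longitude)
user = (40.7484, -73.9857)
landmark = (40.7527, -73.9772)
print(f"Draw the AR label at {bearing_to(*user, *landmark):.0f} degrees from north")
```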
Similarly, a user may be able to approach the ruins of a historically significant archaeological site and experience what the location was like in its prime. This is one of many ways AR allows you to see the unseen in order to create an immersive experience.
Adding Scarcity Back to Digital Content
Everyone knows by now that digital content is infinitely repeatable. We see this every day when we forward Word documents, watch the same YouTube video, or stream music. Businesses have benefited from this by creating content once and scaling it with ease. The problem is that freely reproducible content suffers from an overabundance of supply, driving its value down. Just ask musicians how downloaded music has affected their livelihood. While there are licenses that can control this, they can only go so far.
Pokémon Go is one of the most popular examples of this in AR. Users need to travel to different locations to catch Pokémon. While the content itself doesn’t need to be tied to the specific locations where these Pokémon are found, placing it there gets users moving. You can’t catch every Pokémon from a single spot; you have to visit multiple locations to find them all, which limits what a user can find without physically traveling to different areas.
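As a rough sketch of how location-gated content like this might work under the hood, the snippet below uses the haversine formula to measure the ground distance between the user’s GPS fix and a list of spawn points, unlocking content only within a small radius. The spawn data and radius are invented for illustration.

```python
import math

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

# Hypothetical spawn points: content only unlocks within 50 m
spawns = [("water-type", 40.7812, -73.9665), ("grass-type", 40.7794, -73.9632)]

def visible_content(user_lat, user_lon, radius_m=50):
    return [name for name, lat, lon in spawns
            if haversine_m(user_lat, user_lon, lat, lon) <= radius_m]

print(visible_content(40.7813, -73.9664))  # -> ['water-type']
```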
Another example is anchoring content in a location so that other users can view it there. WallaMe uses this effectively to let users leave digital “graffiti,” combining image recognition with location so users can create their own custom designs in a specific place. Passersby can then discover context, experience art, or anything else digital “graffiti” can add to a scenario.
Distance
Using tools like ultrasonic sensors, an application can read distance more accurately than it could with a camera. This can be crucial when actual distance matters. One of the best use cases for this kind of technology is showing the distance to other cars on a vehicle’s AR head-up display (HUD). An ultrasonic sensor could be employed to detect how many car lengths away another vehicle is and display that to the driver.
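To illustrate the underlying math (not any particular vehicle’s API): an ultrasonic sensor reports the round-trip time of a sound pulse, which converts to a one-way distance that a HUD could then express in car lengths. The echo duration and average car length below are assumptions.

```python
SPEED_OF_SOUND_M_S = 343.0   # in air at roughly 20 °C
CAR_LENGTH_M = 4.5           # rough average car length (assumption)

def echo_to_distance_m(echo_duration_s):
    """Convert an ultrasonic echo round-trip time to one-way distance."""
    return echo_duration_s * SPEED_OF_SOUND_M_S / 2

def distance_in_car_lengths(echo_duration_s):
    return echo_to_distance_m(echo_duration_s) / CAR_LENGTH_M

# Hypothetical echo: 52.5 ms round trip ~= 9 m ~= 2 car lengths
echo = 0.0525
print(f"{distance_in_car_lengths(echo):.1f} car lengths ahead")
```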
Direction
Magnetometers are included in all smartphones and many other devices these days. They allow an application to detect which direction a user is facing, both in terms of cardinal direction and relative to where they were facing before. Both approaches can be used to benefit AR experiences.
Cardinal Directions
If the application cares about real-world scale, then understanding that a user is facing north, for example, can help determine what content should be generated. This could be used in a HUD to show the user’s direction, or to create labels in the distance showing what the user might be looking at. It could also be combined with GPS data to help users orient themselves.
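Here is a minimal sketch of turning raw magnetometer readings into a compass heading and a cardinal label, assuming the device is held flat and the readings are already calibrated; a real implementation would also correct for tilt and magnetic declination.

```python
import math

def heading_degrees(mag_x, mag_y):
    """Compass heading from the horizontal magnetometer components.
    Axis conventions vary by device; adjust signs/order accordingly."""
    return (math.degrees(math.atan2(mag_y, mag_x)) + 360) % 360

def cardinal_label(heading):
    names = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]
    return names[int((heading + 22.5) // 45) % 8]

# Hypothetical calibrated reading
h = heading_degrees(0.0, 25.0)
print(f"{h:.0f} degrees ({cardinal_label(h)})")   # 90 degrees (E)
```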
Relative Directions: Wayfinding
If the application does not care about cardinal directions, it may still care about how far the user has turned from where they were facing. This can be measured against a reference coordinate system generated when the application launches and used to change the content being displayed. This is frequently used for wayfinding.
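For wayfinding, what usually matters is the signed difference between the current heading and a reference captured at launch. A small sketch, normalizing the turn into the range -180° to 180°:

```python
def relative_turn(current_heading, reference_heading):
    """Signed degrees turned since the reference was captured.
    Positive = clockwise (to the user's right)."""
    return (current_heading - reference_heading + 180) % 360 - 180

reference = 350.0                       # captured when the app launched
print(relative_turn(10.0, reference))   # 20.0: user turned 20 degrees right
print(relative_turn(300.0, reference))  # -50.0: user turned 50 degrees left
```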
Temperature
While temperature is more limited in how it can provide value, it can be crucial data in some contexts. For example, Qwake Technologies has developed an AR firefighter’s mask called C-Thru, which pairs a specialized thermal camera with a see-through display to help firefighters recognize people and objects in dark, smoke-filled rooms. This application of temperature sensing can play a major role in saving the lives of both civilians trapped in a building and the firefighters working to help them.
Gas/Smoke Sensors
Similarly, smoke and gas sensors could be applied to HUDs for firefighters or other first responders, letting the user know about hazards those sensors detect. This would allow them to recognize levels of gas or smoke and identify where they might be coming from, or when they should take extra precautions.
Alternatively, gas sensors could be added to displays for workers on oil rigs and other locations where leaks are possible. While the main goal of these tools may be on-the-job training or other tasks where AR helps the user perform their work, integrated gas sensors could warn the user of dangerous conditions before they get too close.
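As a sketch of how sensor readings could drive HUD warnings, the snippet below compares hypothetical gas readings against made-up thresholds; a real deployment would use calibrated sensors and regulated exposure limits.

```python
# Hypothetical warning thresholds in parts per million (illustrative only)
THRESHOLDS_PPM = {"CO": 35, "H2S": 10, "CH4": 1000}

def hud_warnings(readings_ppm):
    """Return a HUD message for each gas reading above its threshold."""
    return [f"WARNING: {gas} at {ppm} ppm (limit {THRESHOLDS_PPM[gas]} ppm)"
            for gas, ppm in readings_ppm.items()
            if gas in THRESHOLDS_PPM and ppm > THRESHOLDS_PPM[gas]]

# Hypothetical sensor sample
for msg in hud_warnings({"CO": 12, "H2S": 24, "CH4": 400}):
    print(msg)   # -> WARNING: H2S at 24 ppm (limit 10 ppm)
```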
The Possibilities Are Endless
So far I have highlighted some of the many sensors that make a difference in the world of AR, but there is so much more out there. Moisture, tilt, sound, humidity, etc. could all be valuable in different scenarios. The number of sensors out there today is expanding rapidly, and it can be really helpful to consider everything you want to track when designing AR applications. When the world of IoT meets that of AR, big things can happen!
So, What Do I Use?
With so many options for understanding reality, it is rare that you will have the time or resources to implement all of them. Each application has different needs and each team has different resources. In the end, it is important to narrow down your requirements to what is most important. Each industry will have different things that may need to be sensed, and each hardware option has limitations on what it provides or could integrate with. While some of these data readings may be best used in a simple dashboard, some may make sense to leverage for AR. A good way to start is to use sensors you already have integrated, or to add one at a time. Make sure the data can be turned into a meaningful visual cue, and that the cue is helpful when contextualized against the physical world. Getting the sensors integrated may be a challenge, but adding one or two to your experience can have a big impact.