Designing an Augmented Reality Airport Navigation App with HoloLens
Last updated on December 7th, 2017
I would like to share the journey of designing and developing AR-Port, our Bachelor thesis project. In this article I will mainly cover our design process; I will write another article to explain the AR design principles that we have summarized.
AR-Port is an airport navigation app for HoloLens. Through this work, we wanted to explore what the design principles of Augmented Reality actually look like.
We are a small team of two: Yijian Lan and Alexander Kübler. This project is our Bachelor thesis. We are very interested in applications of AR and in how AR user interfaces should look in our daily life, so we decided to use our Bachelor thesis to explore the design principles of AR and apply them to an AR app.
We began with an investigation of application fields. Our first question was: in which field can AR really act as an industry disruptor? Or: in which field does AR really make sense? There are many applications and devices on the market, and some of them are really cool, but we still can't find a true killer app. Most apps provide users with a new experience, yet today most devices are designed for the business end rather than the consumer end. So we investigated the chances of AR in different fields.
Eventually we picked navigation at the airport.
Navigation is a daily activity for everyone, and there are more and more navigation apps for smartphones. We think it is a good theme for exploring the difference between a smartphone and a head-mounted display. With its complex building structure, an airport is like a tiny city, and it is not always easy to keep track of where you are: many different terminals and gates, and people from different cultures and countries. These elements amount to a great deal of information. How to appropriately anchor this information in the real world was a challenge for us.
Scope of Design
The core functions of our navigation app are the map, a voice assistant, a context-based dashboard, and navigation.
HoloLens was a totally new platform for us, and designing a user interface and content in 3D space was a whole new topic. Fortunately, HoloLens offers a lot of guidance for designers and developers, and those guidelines helped us a lot.
In the early stages of design, we tried to find a way to start. It was important for us to build an understanding of our technical development environment. The Microsoft HoloLens has an additive display and is able to map holograms into the real world. The user can observe a hologram from any angle; it appears just like a real object. That means every piece of content that previously lived on our smartphones may find its own place and take on another form when displayed in the real world. So we started discussing how to handle the relationship between user, environment, and hologram.
In our research phase, we learned that many designers believe we should avoid 2D assets as much as we can, because flat objects do not really respond to the environment (light, collision, etc.). We tried to explore different representations of the user interface in our living space. In a first attempt, we built two variants, a flat and a 3D interface, to figure out whether basic UI components (like buttons) in 3D form make any sense in an AR environment from a usability point of view.
After validation and user tests, we still decided on a flat version for our dashboard interface. We realized that a button in 3D form takes extra time and unnecessary energy from designers. Although the 3D version looks good at first glance, after a while users notice that this kind of 3D content offers no extra benefit compared with the 2D version. On the contrary, such a button can distract users' attention from the interface and makes the interaction more complex. After all, a button is meant to be operated, not read or observed. We believe that in AR UI design, 2D UI components are still needed and can guide users very efficiently. You will never need a holographic clock to tell users the time; just show them the information directly.
We started iteratively constructing the interfaces of the dashboard, the map, and the voice assistant. The biggest challenge in the design process was how to balance the user's attention between reality and interface. If an AR HMD is really to become a mobile device in our life, holograms should exist naturally in the user's vision without blocking objects in reality. So we built a lot of variants and ran tests to ensure that the typography of the interface has good legibility and that the interface always stays in a suitable location in the environment relative to the user. We spent a lot of time on user tests to get a proper result.
Microsoft's guidelines for HoloLens and the Holo Academy were really helpful for our work and saved us a lot of time. "Tag-along" and "Billboarding" are two very important concepts, and they inspired much of the interaction design of the body-locked interface for this project.
Our work process
Certain colors appear totally different on the additive display of HoloLens: cool colors tend to recede into the background, while warm colors seem to sit in the foreground. Because the display is additive, there is also no black on HoloLens, since there is no black light. That is why we always had to test our screen designs on HoloLens to make sure they looked the way we intended. But deploying to HoloLens through Unity and Visual Studio is not easy, and the process takes a lot of time.
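The "no black" effect falls directly out of how an additive display combines light. As a conceptual sketch (plain Python, not HoloLens code), the displayed color is the real-world light plus the hologram's light, clamped at full brightness:

```python
# Conceptual sketch of an additive display: the rendered hologram color is
# added to the light already arriving from the real world, so a black pixel
# (0, 0, 0) contributes nothing and appears fully transparent.

def additive_blend(real_world, hologram):
    """Combine real-world light with hologram light, channel by channel.

    Both colors are (r, g, b) tuples in the range 0.0-1.0; the sum is
    clamped at 1.0 because the display cannot exceed full brightness.
    """
    return tuple(min(r + h, 1.0) for r, h in zip(real_world, hologram))

bright_wall = (0.8, 0.8, 0.8)   # light coming in from the environment
black_pixel = (0.0, 0.0, 0.0)   # "black" hologram content
white_pixel = (1.0, 1.0, 1.0)   # white hologram content

# A black hologram pixel leaves the real world unchanged -> invisible.
print(additive_blend(bright_wall, black_pixel))  # (0.8, 0.8, 0.8)

# White always pushes toward saturation, which is why it looks so bright.
print(additive_blend(bright_wall, white_pixel))  # (1.0, 1.0, 1.0)
```

This also explains why a design that looks fine on an ordinary monitor can look washed out or invisible on the device: the background light the monitor doesn't have is always part of the final image.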
We found our own workflow that let us iterate and test our designs faster. Through Box and OneDrive we could open our screen designs directly on HoloLens. Hologen is an app for sharing holographic content; we used it to deploy our 3D models to HoloLens rapidly.
UI Elements — Dashboard
The dashboard was the first part we designed, and we made a lot of variants. We asked ourselves how a flat UI for AR should look. The final version is shown above. The dashboard no longer has a border: the world is our canvas, so why should we limit our interface with a border again? White appears very bright on HoloLens and should usually be used sparingly, but after a large number of user tests we still chose white with some appropriate transparency for our interface components, to ensure the legibility of the text in different environments.

As for interaction design, as developers we were unfortunately not allowed to define our own gestures for HoloLens, which is why we used a clickable blue cube to simulate a gesture that wakes the dashboard. The user's hand is tracked at all times; when a hand is detected, the cursor shows corresponding feedback to let the user know that HoloLens is ready. We constructed our dashboard interface with the two concepts "Tag-along" and "Billboarding": tag-along means the interface always follows you, and billboarding makes the interface always face the user. The world canvas also gave us a chance to explore new kinds of button feedback. Besides color as a feedback parameter, we introduced the Z-axis: when a button is in the hover state, it gets a floating effect.
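Both behaviors boil down to simple vector math. The sketch below is conceptual Python (our actual implementation lived in Unity/C#, and the distances and easing factor here are illustrative values, not the ones we shipped): billboarding computes the yaw that turns a panel toward the user, and tag-along lazily pulls the panel back once it drifts too far away.

```python
import math

# Conceptual sketch of the two dashboard behaviors, "billboarding" and
# "tag-along". Coordinates are (x, y, z) with x = right, z = forward.

def billboard_yaw(panel_pos, user_pos):
    """Yaw angle (radians) that rotates a panel to face the user,
    computed in the horizontal plane."""
    dx = user_pos[0] - panel_pos[0]
    dz = user_pos[2] - panel_pos[2]
    return math.atan2(dx, dz)

def tag_along(panel_pos, target_pos, max_distance=1.5, ease=0.1):
    """Move the panel one step toward its target position in front of
    the user, but only when it has drifted farther than max_distance.

    `ease` controls how quickly it catches up (0 = never, 1 = instantly),
    which gives the lazy "follows you" feel instead of a rigid lock.
    """
    offset = [t - p for p, t in zip(panel_pos, target_pos)]
    distance = math.sqrt(sum(o * o for o in offset))
    if distance <= max_distance:
        return panel_pos  # close enough: let the panel rest
    return tuple(p + o * ease for p, o in zip(panel_pos, offset))

# The user has walked 4 m away; each frame the panel drifts after them.
panel = (0.0, 1.6, 0.0)
spot_in_front_of_user = (0.0, 1.6, 4.0)
panel = tag_along(panel, spot_in_front_of_user)
```

Calling `tag_along` once per frame produces the smooth trailing motion; combining it with `billboard_yaw` keeps the panel readable from wherever the user ends up.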
UI Element — Map
The map is like a personal smart information point. With the map, users can find the positions of and information about shops at the airport. The map consists of three parts: a toolbar, a 3D map, and an Information Center. With the toolbar, the user can move, zoom, scale, and rotate the map. Users can click a building on the map (e.g. a restaurant, duty-free shop, or healthcare facility) to get the corresponding information. Labels on the map show the positions of information points and toilets; further labels mark the user's destination. The toolbar, the Information Center, and all labels always face the user. Users can observe the map from different views and angles; compared with a 2D map, it feels alive. Through the Information Center, users can search for any keyword; the results are displayed in both the Information Center and the 3D map.
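Under the hood, the Information Center's keyword search is essentially a filter over the points of interest the 3D map already knows about. A minimal sketch, with entirely made-up example data (the POI names, categories, and gates below are hypothetical, not from the real airport):

```python
# Conceptual sketch: keyword search over points of interest (POIs).
# The entries are invented for illustration.

AIRPORT_POIS = [
    {"name": "Duty Free Shop", "category": "shopping",   "gate": "B12"},
    {"name": "Cafe Milano",    "category": "restaurant", "gate": "B14"},
    {"name": "Pharmacy",       "category": "healthcare", "gate": "A03"},
    {"name": "Sushi Bar",      "category": "restaurant", "gate": "C07"},
]

def search_pois(keyword, pois=AIRPORT_POIS):
    """Return every POI whose name or category contains the keyword.

    Case-insensitive. In the app, the results would appear both as a
    list in the Information Center and as highlighted labels on the
    3D map.
    """
    keyword = keyword.lower()
    return [p for p in pois
            if keyword in p["name"].lower() or keyword in p["category"].lower()]

for poi in search_pois("restaurant"):
    print(poi["name"], "near gate", poi["gate"])
```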
UI Element — Voice assistant
Conversational user interfaces have been discussed for a long time. At the moment they don't work very well because of technical limitations, but they are still a promising form of user interface. For this navigation application, we built a voice assistant for navigation: users talk to the system naturally and get the information they want. When users express their needs directly in words, they don't have to traverse a predefined path and information architecture the way they would in a normal GUI. If users would like to find a place to work, they don't have to search with different hashtags; they just tell the system their requirements, e.g. quiet places, fewer people, etc.
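The idea can be sketched as mapping phrases in a request to constraints on place attributes, then filtering. This is a deliberately naive toy (the places, attribute values, and thresholds are all hypothetical; a real assistant would use a proper natural-language-understanding service rather than substring matching):

```python
# Conceptual sketch: turn a spoken request into attribute constraints
# instead of forcing the user down a fixed information architecture.
# All data and thresholds below are invented for illustration.

PLACES = [
    {"name": "Business Lounge", "noise": 0.2, "crowding": 0.3},
    {"name": "Food Court",      "noise": 0.9, "crowding": 0.8},
    {"name": "Reading Corner",  "noise": 0.1, "crowding": 0.1},
]

# Very rough phrase-to-constraint mapping.
CONSTRAINTS = {
    "quiet":        lambda p: p["noise"] < 0.4,
    "fewer people": lambda p: p["crowding"] < 0.4,
}

def answer(request):
    """Return the names of places satisfying every constraint the
    request mentions; with no recognized phrase, everything matches."""
    active = [check for phrase, check in CONSTRAINTS.items() if phrase in request]
    return [p["name"] for p in PLACES if all(check(p) for check in active)]

print(answer("I want a quiet place with fewer people to work"))
```

The point of the sketch is the shape of the interaction: the user states a goal once, and the system resolves it against what it knows, instead of the user navigating categories.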
Conquering Spatial UX is one of the most exciting challenges designers will face in the coming years — Marnix Kickert
Navigation — Guide-system
When we enter the keyword "AR navigation" on Google, the search results generally look like this:
Those solutions are indeed the first idea that appears in people's minds when we talk about AR navigation. But we doubted whether this solution really works, and the only way to clarify the question was user tests.
We used Actiongram to build a low-fidelity prototype and test whether our doubts were justified. The result: we were right. We put guide elements on the floor and let our test participants simply follow them to a pre-set destination. Everyone could find the endpoint, but along the way the participants ignored everything except our guide elements. One of them even walked into a door, because he was focusing only on the guide elements. We believe that putting a line or similar elements on the road or floor is not the right solution for AR navigation. After that, we tried to find the answer in real life. Basically, AR means we put digital content into the real world; to some extent, the virtual content is a mapping of objects in the real world. So why not look for the right solution in real life? We wanted our users to be guided seamlessly and intuitively.
So we found the escape signs.
We referred to the design guidelines of emergency guide systems to design our own guide system, defining the size of the guide elements, their position relative to the user, and so on. In our final design we use breadcrumbs (not exactly the same as web breadcrumbs): the guide elements are world-locked and float in the air, but always face the user. It is actually very similar to the traditional guide system at an airport, but it is a really efficient approach. Users don't have to focus on the holographic guide elements all the time, only when they need them, and can otherwise carry on with what they are doing. A number shows the remaining distance in real time. There is a guide element every 30 meters, so users can find one with adequate legibility at any time.
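Placing those elements is a small geometry exercise: walk along the route polyline, drop a breadcrumb every 30 meters, and attach the remaining distance to each one. A conceptual sketch in Python (the route coordinates are invented; the real elements are world-locked 3D holograms, not 2D points):

```python
import math

# Conceptual sketch: place a guide element ("breadcrumb") every 30 meters
# along the walking route and attach the remaining distance to each one.

def place_breadcrumbs(route, spacing=30.0):
    """Return (x, y, meters_remaining) for evenly spaced guide elements
    along a 2D polyline `route` of (x, y) waypoints."""
    # Length of each segment, then the total route length.
    seg_len = [math.dist(a, b) for a, b in zip(route, route[1:])]
    total = sum(seg_len)

    crumbs = []
    walked = spacing
    while walked < total:
        # Find the segment this breadcrumb falls on.
        d, i = walked, 0
        while d > seg_len[i]:
            d -= seg_len[i]
            i += 1
        (x0, y0), (x1, y1) = route[i], route[i + 1]
        t = d / seg_len[i]  # fraction along the segment
        crumbs.append((x0 + t * (x1 - x0),
                       y0 + t * (y1 - y0),
                       total - walked))  # distance still to go
        walked += spacing
    return crumbs

# A 100 m straight corridor followed by a 50 m turn.
route = [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0)]
for x, y, remaining in place_breadcrumbs(route):
    print(f"breadcrumb at ({x:.0f}, {y:.0f}), {remaining:.0f} m to go")
```

In the app, each of these positions gets a billboarded guide element showing the remaining distance, so the user can glance at the nearest one whenever they need reassurance.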
Some of our principles
- From smartphone to HMD — Designing for an HMD and for a smartphone are completely different. When we design UI for a smartphone, the screen is our canvas; to a certain extent, this canvas gives designers a high degree of freedom, and the content is limited to the screen. For an HMD, ideally the whole world is the canvas, with many more possibilities for building UI. The most important point is that digital content and reality overlap. This is an advantage but also a disadvantage. Designers can put any content into the real world, and everything can be interactive. Sounds great! But think about it: is digital content in the real world always a good deal? How should we present digital content to users? When is the right moment, and when would digital content disturb the user? This was the first question we had. The scenario is a very important element of UX design; when we design an application for an HMD, the scenario is absolutely decisive. We must think precisely about the user's situation, what we would like users to see, where we can show users something, and where there is absolutely no need to put a hologram. The way of thinking is actually similar to normal application design, but the focus is different. Designers must define the relationship between the user, the real environment, and the virtual content. Especially since we don't yet have general design principles today, as designers we have to be extra careful.
- Read the guidelines! — This is not a new story, but it is so important. The guidelines helped us save a lot of time and figure out faster which direction to go. We rapidly learned many basic design principles from the official guidelines for our target platform. Some design principles can be applied to any AR-related device; others depend to a great extent on the parameters of the hardware. Designers should identify this for themselves.
- Know your new device and do many tests — In our project, HoloLens is our target platform. There are many factors that differ from traditional devices and that we must pay attention to. For example, HoloLens uses an additive (see-through) display: the light used to render the holograms is overlaid on the light from the real world. So when we design our GUI on a computer, we need to keep in mind that the screen design we see on the computer will probably look different on the HoloLens. Before starting to design, every designer should read the official documentation and design guidelines thoroughly. The only solution for the remaining questions is testing. We knew that black does not really exist on an additive display, that white appears very bright, and so on, but we still did a lot of user tests and usability tests to reach our goal. Last but not least, the hardware limitations cannot be ignored. For instance, the FOV of HoloLens is only about 30 degrees. If our design cannot adapt to it, users can miss a lot of critical information; sometimes that merely damages the user experience, and sometimes it could be dangerous.
- Keep the question in your mind — How to deal with the relationship between virtual content and the real world is a permanent question, because in different contexts the answer can be very different. But there are still some principles we generally need to comply with, and I will write another article to share these points with you.
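The narrow-FOV problem mentioned above can be made concrete with a simple angle test: if the direction to a hologram deviates from the user's gaze by more than half the field of view, the hologram is currently invisible, and the app could react (for example with a directional hint or a tag-along element). A conceptual sketch, assuming a roughly 30-degree horizontal FOV and working only in the horizontal plane:

```python
import math

# Conceptual sketch: detect when a hologram lies outside the user's
# (roughly 30-degree) horizontal field of view.

def in_field_of_view(user_forward, to_hologram, fov_degrees=30.0):
    """True if the direction to the hologram lies within the FOV.

    Both arguments are 2D direction vectors in the horizontal plane.
    """
    angle = abs(math.atan2(to_hologram[1], to_hologram[0])
                - math.atan2(user_forward[1], user_forward[0]))
    if angle > math.pi:            # wrap around so the result is <= 180 deg
        angle = 2 * math.pi - angle
    return math.degrees(angle) <= fov_degrees / 2

# Looking straight ahead along +x: a hologram 10 degrees off-axis is
# visible, one at 45 degrees is not.
forward = (1.0, 0.0)
ten_deg = (math.cos(math.radians(10)), math.sin(math.radians(10)))
print(in_field_of_view(forward, ten_deg))   # True
print(in_field_of_view(forward, (0.5, 0.5)))  # 45 degrees off-axis: False
```

A check like this is what lets an app fail gracefully under the hardware limitation instead of silently hiding critical information just outside the frame.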