Designing an Augmented Reality Airport Navigation App with HoloLens

Exploring AR design principles through a design project

Last updated on December 7th, 2017

I would like to share the journey of designing and developing our school project, AR-Port. AR-Port is an airport navigation app for HoloLens. Through this work we wanted to explore what the design principles of augmented reality actually look like.


We are a team of two, Yijian Lan and Alexander Kübler, and this project is our bachelor thesis. We are very interested in the applications of AR and in what an AR user interface should look like in daily life, so we decided to design an AR app as our bachelor thesis to explore the design principles of AR.

Kick-off

We began by investigating fields of application. Our first question was: in which field can AR really act as an industry disruptor? Or put differently: where does AR truly make sense? There are many applications and devices on the market, some of them genuinely cool, but we still could not find a real killer app. Most apps offer users a new experience, yet today most devices target businesses rather than consumers. So we investigated the prospects of AR in different fields.

Eventually we settled on navigation at the airport.

Some data from Goldman Sachs

Navigation is a daily activity for everyone, and there are more and more navigation apps on smartphones. We think it is a good theme for exploring what distinguishes a head-mounted display from a smartphone. The complex building structure of an airport is like a tiny city, and it is not always easy to keep one's bearings: many different terminals and gates, people from different cultures and countries. These elements add up to a great amount of information. How to appropriately anchor this information in the real world was a challenge for us.

Scope of Design

Our core functions are the map, the voice assistant, a context-based dashboard, and navigation.

Designing AR-Port

HoloLens was a totally new platform for us, and designing a user interface and content in 3D space was a whole new theme. But HoloLens comes with extensive guidance for designers and developers, and those guidelines helped us a lot.

First shot

At the very beginning we tried to find a way to get started. HoloLens has an additive display that shows holograms in the real world. The user can observe a hologram from any angle; it appears like a real object in the real world. That means every piece of content that previously existed only on our smartphones may have its own place and take a different form in the real world. So we started discussing how to handle the relationship between user, environment, and hologram.

Button UI

In our research phase we learned that many designers believe 2D assets should be avoided when possible, because a flat object will not respond realistically to the environment, e.g. to light or collisions. So we tried to explore different representations of the user interface in our living space. In a first attempt we built two variants, a flat and a 3D interface, to figure out whether 3D buttons in an AR environment make any sense from a usability point of view.

First attempt

After validation and user tests we still decided on the flat version for our dashboard interface. We realized that 3D buttons demand more time and energy from the designer. Although the 3D version looks good at first glance, after a while users notice that this 3D content offers nothing new compared with the 2D version. On the contrary, such buttons can distract the user's attention from the interface. After all, a button is meant to be operated rather than read or admired.

HoloLens screenshot

Iterative Design

We started iteratively building the interfaces of the dashboard, the map, and the voice assistant. The biggest challenge in the design process was balancing the user's attention between reality and the interface. If an AR HMD is really to become a mobile device in our daily lives, holograms should exist naturally in the user's field of view without blocking objects in reality. So we built many variants and ran tests to ensure that the typography of the interface remains legible and that the interface always sits in a suitable location in the environment relative to the user. We spent a lot of time on user tests to reach a proper result.

Microsoft's guidelines for HoloLens and the Holographic Academy tutorials were really helpful for our work and saved us lots of time. "Tag-along" and "billboarding" are two very important concepts, and they inspired much of the interaction design of the body-locked interface in this project.

Our work process

Certain colors appear totally different on the additive display of HoloLens: cool colors tend to recede into the background, while warm colors seem to sit in the foreground. Because the display is additive, there is also no black on HoloLens, since there is no black light. That is why we always had to test our screen designs on the device to make sure we got what we intended. But deploying to HoloLens via Unity and Visual Studio is not easy, and the process takes a lot of time.
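The additive behavior is easy to reason about: each hologram pixel's light is simply added to the light already arriving from the scene behind it. Here is a rough Python sketch of that idea (our own simplification for illustration; real rendering is of course more involved):

```python
def perceived_color(world, hologram):
    """On an additive (see-through) display, the hologram's light is
    added to the light from the real world. A 'black' hologram pixel
    (0, 0, 0) adds nothing, so it is effectively transparent."""
    return tuple(min(w + h, 255) for w, h in zip(world, hologram))

# A black hologram pixel over a bright wall leaves the wall unchanged:
print(perceived_color((200, 200, 200), (0, 0, 0)))
# A white hologram pixel over the same wall saturates toward full white:
print(perceived_color((200, 200, 200), (255, 255, 255)))
```

This is also why black text or backgrounds simply disappear on the device, and why white reads as extremely bright.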

We found our own workflow to iterate and test our designs faster. Via Box and OneDrive we can open our screen designs directly on HoloLens. Hologen is an app for sharing holographic content; we used it to deploy our 3D models to HoloLens rapidly.

Our process

UI Element — Dashboard

AR-Port Dashboard

The dashboard was the first part we designed, and we made many variants, asking ourselves what a flat UI for AR should look like. The final version is shown above. The dashboard no longer has a border: the world is our canvas, so why should we limit our interface with a border again? White appears very bright on HoloLens and should usually be used sparingly, but after a large number of user tests we still chose white with some transparency for our interface components, to ensure the legibility of the text in different environments.

Regarding interaction design, developers are unfortunately not allowed to define their own gestures for HoloLens, which is why we use a clickable blue cube to simulate a gesture that wakes the dashboard. The user's hand is tracked continuously; as soon as a hand is detected, the cursor gives feedback to let the user know HoloLens is ready. We built our dashboard interface on two concepts, "tag-along" and "billboarding": tag-along means the interface always follows you, while billboarding makes the interface always face the user. The world-as-canvas also gave us room to explore new button feedback: besides color, we introduced the Z-axis as a feedback parameter, so a button in the hover state gets a "float effect".

AR-Port Dashboard
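Outside of Unity, the two behaviors can be sketched as plain vector math. The following is a minimal Python illustration of the idea, not the code we actually shipped; the function names and the `distance`/`speed` parameters are invented for this sketch:

```python
import math

def billboard_yaw(panel_pos, user_pos):
    """Yaw angle (radians) that turns a panel at panel_pos toward
    user_pos, rotating around the vertical (Y) axis only."""
    dx = user_pos[0] - panel_pos[0]
    dz = user_pos[2] - panel_pos[2]
    return math.atan2(dx, dz)

def tag_along_step(panel_pos, user_pos, user_forward,
                   distance=1.5, speed=0.1):
    """Move the panel one frame toward its target: a point `distance`
    meters in front of the user. Interpolating instead of snapping
    keeps the panel from jittering with every small head movement."""
    target = [user_pos[i] + user_forward[i] * distance for i in range(3)]
    return [panel_pos[i] + (target[i] - panel_pos[i]) * speed
            for i in range(3)]
```

Called every frame, `tag_along_step` lets the panel drift smoothly into the user's view instead of being rigidly glued to the head, which is exactly what makes tag-along feel more natural than a head-locked interface.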

UI Element — Map

AR-Port Map

The map is like a personal smart information point. With it, the user can find the positions of, and information about, the shops at the airport. The map consists of three parts: toolbar, 3D map, and information center. With the toolbar the user can move, zoom, scale, and rotate the map, and by clicking a building on the map (e.g. restaurant, duty free, healthcare, etc.) the user gets the corresponding information. Labels on the map show the positions of information points and toilets, and there is also a label for the user's destination. The toolbar, the information center, and all labels always face the user. The user can observe the map from different views and angles; compared with a 2D map, this one is alive. Through the information center the user can enter any keywords to search; results are displayed in the information center and on the 3D map.
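The keyword search itself can be as simple as matching the query against the names, categories, and tags of the points of interest. A toy Python sketch with invented sample data (not our production code):

```python
# Hypothetical sample data; the real app loads this from the airport model.
AIRPORT_POIS = [
    {"name": "Gate A12", "category": "gate", "tags": ["departure"]},
    {"name": "Duty Free Shop", "category": "shop", "tags": ["perfume", "liquor"]},
    {"name": "Quiet Lounge", "category": "lounge", "tags": ["quiet", "work"]},
]

def search_pois(query, pois):
    """Case-insensitive match against name, category, and tags."""
    q = query.lower()
    return [p for p in pois
            if q in p["name"].lower()
            or q in p["category"]
            or any(q in t for t in p["tags"])]

print([p["name"] for p in search_pois("quiet", AIRPORT_POIS)])
```

Every result list then drives two views at once: the text list in the information center and the highlighted buildings on the 3D map.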

UI Element — Voice assistant

Voice Assistant

Conversational user interfaces have been discussed for a long time. At the moment they do not work very well because of technical limitations, but they remain a promising form of user interface. For this navigation application we built a voice assistant: users talk to the system naturally and get the information they want. When users express their needs directly in words, they say what they really want. This is unlike a normal GUI, where users can only reach results through predefined filters. If users would like to find a place to work, they do not have to combine different filters, but simply tell the system their requirements, e.g. quiet places, few people, etc.
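One simple way to turn such free-form requests into searchable attributes is a keyword-to-attribute mapping. A real assistant would use proper natural language understanding; the mapping and attribute names below are hypothetical, a minimal sketch of the idea:

```python
# Hypothetical phrase-to-attribute table; a production assistant
# would use a trained language-understanding model instead.
REQUIREMENT_KEYWORDS = {
    "quiet": {"noise_level": "low"},
    "less people": {"crowd": "low"},
    "coffee": {"category": "cafe"},
}

def extract_requirements(utterance):
    """Collect all attribute constraints whose trigger phrase
    appears in the user's utterance."""
    requirements = {}
    text = utterance.lower()
    for phrase, attrs in REQUIREMENT_KEYWORDS.items():
        if phrase in text:
            requirements.update(attrs)
    return requirements
```

The extracted attributes can then feed the same search backend as the map, so "a quiet place with less people" becomes a structured query without the user ever touching a filter panel.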

Conquering Spatial UX is one of the most exciting challenges designers will face in the coming years — Marnix Kickert

Navigation — Guide-system

When we enter the keyword "AR navigation" on Google, the search results generally look like this:

Search results from Google

Those solutions are indeed the first idea that comes to people's minds when we talk about AR navigation. But we doubted whether this solution really works. The only way to clarify the question was user testing.


We used Actiongram to build a low-fidelity prototype and test whether our doubts were justified. The result: we were right. We placed guide elements on the floor and asked our test participants to simply follow them to a pre-set destination. Everyone could find the end point, but during the process the participants ignored everything except our guide elements. One of them even hit a door, because he focused only on the guide element and followed it straight into the door. We believe that putting a line or similar elements on the road or floor is not the right solution for AR navigation.

After that, we looked for the answer in real life. Basically, AR means putting digital content into the real world; to some extent, the virtual content is a mapping of objects in the real world. So why not find the right solution in real life itself? We want our users to be guided seamlessly and intuitively.

So we found this...

We referred to the design guidelines of emergency guidance systems to design our own guide system, and we defined the size of the guide elements, their position relative to the user, and so on. In our final design we use breadcrumbs (not quite the same as web breadcrumbs): the guide elements are world-locked and float in the air, but always face the user. It is actually very similar to a traditional airport guidance system, but it is a really efficient approach. Users do not have to focus on the holographic guide elements all the time; only when they need to, they can take a look and then carry on with what they are doing. The number on each element shows the remaining distance in real time. There is a guide element every 30 meters, so users can find one with adequate legibility at any time.
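The placement logic behind this is straightforward: walk along the route and drop a breadcrumb every 30 meters, each labeled with the distance still to go. A simplified Python sketch that treats the route as a straight line (function name and output format are our own illustration):

```python
def place_breadcrumbs(path_length, spacing=30.0):
    """Positions of guide elements along a route of path_length meters,
    one every `spacing` meters, each labeled with the remaining
    distance to the destination."""
    crumbs = []
    d = spacing
    while d < path_length:
        crumbs.append({"at": d, "remaining": round(path_length - d)})
        d += spacing
    return crumbs

print(place_breadcrumbs(100))
```

On the device each breadcrumb is then world-locked at its position along the actual 3D route and billboarded toward the user, so it reads like a floating sign rather than a line painted on the floor.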

AR-HMD Navigation

What we learn & Our principles for AR

  1. From smartphone to HMD - UI design for an HMD and for a smartphone is completely different. When we design UI for a smartphone, its screen is our canvas, and to a certain extent this canvas gives the designer a high degree of freedom; the content is limited to the screen. For an HMD, on the other hand, ideally the whole world is the canvas, and there are many more possibilities for building UI. The most important point is that digital content and reality overlap. This is an advantage but also a disadvantage: the designer can put any content into the real world, and everything could be interactive. Sounds great! But think about it: is digital content in the real world always a good deal? How should we present digital content to the user? When is the right moment, and when would digital content disturb the user? This was our first question. The scenario is a very important element in UX design, and when we design an application for an HMD, the scenario is absolutely decisive. We must think precisely about the user's situation, about what the user should see, about where we could show the user something and where a hologram absolutely does not belong. This way of thinking is actually similar to ordinary application design, but the focus is different: the designer must define the relationship between the user, the real environment, and the virtual content. Especially since we do not yet have general design principles for AR, we as designers have to be all the more careful.
  2. Read the guidelines! - This is not a new story at all, but it is so important. The guidelines helped us save a lot of time and figure out faster which direction to go. We rapidly learned many basic design principles from the official guidelines for our target platform. Some design principles can be applied to any AR-related device; others depend to a great extent on the parameters of the hardware. Designers should identify which is which themselves.
  3. Know your new device and test a lot - In our project, HoloLens was our target platform. There are many factors that differ from traditional devices and that we had to pay attention to. For example, HoloLens uses an additive (see-through) display: the light used to render the holograms is overlaid on the light from the real world. So when we design our GUI on the computer, we have to keep in mind that the screen we design there will probably look different on HoloLens. Before starting to design, every designer should read the official documentation and design guidelines thoroughly. The only solution for the remaining questions is testing. We knew that black does not really exist on an additive display and that white appears very bright, but we still ran many user tests and usability tests to reach our goal. Last but not least, hardware limitations cannot be ignored. For instance, the field of view of HoloLens is just about 30 degrees; if we do not adapt to it, users may sometimes miss what we want to show them, which obviously damages the user experience.
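The FOV point in particular lends itself to a simple check: content is visible only if the angle between the user's gaze direction and the direction to the content is within half the field of view. A minimal Python sketch of that test (the function and its parameters are our own illustration, not a platform API):

```python
import math

def in_fov(user_pos, user_forward, target_pos, fov_deg=30.0):
    """True if target_pos lies within the user's field of view.
    With a narrow FOV like the roughly 30 degrees of HoloLens,
    off-screen content needs an indicator or tag-along behavior."""
    to_target = [target_pos[i] - user_pos[i] for i in range(3)]
    norm = math.sqrt(sum(c * c for c in to_target))
    fwd_norm = math.sqrt(sum(c * c for c in user_forward))
    cos_angle = sum(t * f for t, f in zip(to_target, user_forward))
    cos_angle /= norm * fwd_norm
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle <= fov_deg / 2
```

When this check fails for an important hologram, the app can fall back to a directional hint at the edge of the view, so the user at least knows where to turn.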