Social Experience Design For Mobile Augmented Reality Head Mounted Displays

An exploration of Social Experience Design for Mobile AR HMDs using Experience Prototyping and Evaluative Research methods.

Adam Maz
21 min read · Jan 12, 2018

Introduction

An assumption that my research makes is that the point of entry to Augmented Reality will be ‘Mobile’ for most consumers. What do I mean by ‘Mobile’? I believe that Users will experience and enjoy Augmented Reality on their phones or tablets well before they experience it on an integrated Head Mounted Display like a Microsoft Hololens. This assumption is neither baseless nor arbitrary; it was, and continues to be, informed by a reasonable amount of data such as market insights, trends, scholarly articles and exploratory data analysis. Moreover, I believe ‘accessibility’ is an important factor for predicting growth in adoption rates. While a Meta AR HMD or Hololens is a feat of engineering, owning one is too costly for most consumers, with retail price points ranging from $1,500 to $7,000 USD. At the opposite end of the spectrum, in the world of black-budget Special Access Programs, the costs of developing advanced AR HMDs run far higher: a joint venture between defence contractors Rockwell Collins and Elbit Systems of America (RCEVS), working with Lockheed Martin, produced an AR HMD so advanced it costs approximately $400,000 USD.

Prototype that is part of a 1 Trillion Dollar Special Access Project
Current F-35A Helmet

I believe AR HMDs such as Hololens and Meta will find adoption in various industrial, military and manufacturing applications but I do not think these units will find mass-adoption among consumers.

If costly integrated solutions do not appeal to consumers, then what will? I would suggest that Mobile AR technologies represent a familiar and accessible point of discovery for consumers. Google indicates that Augmented Reality will be on ‘hundreds of millions’ of Android devices in 2018. There are 505 million ARKit-compatible iPhones active today, with 850 million projected by the end of 2020. Accessibility, I believe, will drive AR adoption rates, and the future of Augmented Reality is categorically Mobile.

Despite optimistic adoption rates, the present state of consumer Mobile Augmented Reality leaves much to be desired. There remain several open issues for Mobile AR to address in areas that relate to Human Factors and general usability. As Matt Miesnieks states:

“In the early days of mobile AR, we optimistically thought people would happily hold their phones up in their line of sight to get the experience of an AR app (we called it the Hands Up Display). We vastly overestimated how willing people would be to change behaviour. We vastly overestimated how accepting other people would be of you doing this behaviour. Your app will not be the app that changes this social contract. You’ll need your app to work based on how people already hold their phones.”

The Hands Up Display

The ‘Hands Up Display’ is the human-computer interaction model used by most Mobile AR experiences at the time of this writing. It promotes short sessions of use, roughly as long as the time required to take a selfie. Consequently, Mobile AR becomes a way to visually answer a question rather than a way to pursue long-session engagement.

The ‘status quo’ of Mobile Augmented Reality saddened me. It was painfully apparent that the full potential of Mobile AR technology was not being realized, due largely to a suboptimal human-computer interaction design that is far too reliant upon the unnatural use of one’s arms. This suboptimal ‘status quo’ and the ‘Hands Up Display’ inspired some divergent design thinking on my part. I wondered to myself, ‘Has anyone made a mobile AR HMD for iOS?’, ‘How might cost-effective Mobile AR HMDs improve Augmented Reality for people?’, ‘What if AR HMDs leveraged my phone’s computing power?’, ‘Could sessions be made longer with Mobile AR HMDs?’, and ‘Could we get rid of the Hands Up Display?’.

Before I decided on my project’s objective, I did some Exploratory Research, Competitive Analysis and Benchmarking. The goal of my Competitive Analysis was to identify which companies were actively developing a cost-effective Mobile AR HMD that would appeal to consumers. After all, to explore my questions and research objectives I would need some type of Mobile AR HMD hardware.

I expected that a quick Google search would produce dozens of early-stage start-ups to choose from. Contrary to my expectations, there was only one company with a well-designed product and a clearly articulated value proposition that I thought would resonate with consumers: a small team of Entrepreneurs, Designers and Developers in Los Angeles collectively operating as Mira Labs.

Mira Leadership Team. Ben Taft, Matt Stern, Ben Stein, Evan Bovie

I was excited and pleased to learn that Mira had built an impressive Mobile AR HMD named the ‘Prism’. The Prism’s Dev Kit had a solid SDK complete with excellent technical documentation and tutorials, which made it a logical hardware choice for evaluating the questions my research presented. The Mira Prism retails for $150.00, a cost I felt would be appealing to potential consumers.

Application Development for the Mira Prism is Unity-based. This was very appealing to me, as I am a Product Designer and Software Developer with a background in Human Computer Interaction. I routinely work with the Unity and Xcode IDEs and have strong proficiencies in iOS and Unity Application Development. At the time of my order, Mira was almost sold out of Prism Dev Kits; I placed my order for one of the two remaining headsets.

With the click of a button my journey had started. Little did I know that this journey would be a life-changing learning experience that would involve working directly with the Leadership at Mira in Los Angeles, consulting with my Supervising Professor Dr. James Oliver from ISU and analogue prototyping while travelling through the Yucatan Jungle.

Fun Fact: My research with Mira overlapped with an Ethnographic Study I was doing in the Yucatan. I was knee-deep in Mira research and, just days before my intended departure, my appendix ruptured. Luckily, I recovered rather quickly. I share this story because you’re unfortunately going to see a YouTube video of a highly medicated ‘me’, trying to explain a bug-riddled build that I shouldn’t have sent to the COO, Matt Stern. In retrospect it was a memorable and funny experience.

Mira Promotional Video

Project Objective

The broad objective of my research was to explore Social Experience Design for Mobile AR HMDs using Experience Prototyping and Evaluative Research methods. The project was named the ‘Tanoshii Project’. Tanoshii is Japanese for ‘delightful’, a nod to both Universal Design Theory and Gunpei Yokoi of Nintendo, the Designer of the Virtual Boy.

Gunpei Yokoi

When I was 12, Gunpei’s Virtual Boy gave me the occasional headache, but it also inspired me to pursue a career in Human Computer Interaction and Product Design. I’ve included a really interesting read on the ‘Enigma’ of the Virtual Boy below; I would encourage someone to turn Gunpei’s story into a Netflix documentary, because it is a fascinating one.

The Tanoshii Project had two goals. The first was to develop the technical expertise and proficiency in Mobile AR Application Development necessary to explore my research objectives. The second was to perform some Hypothesis Testing in order to answer several questions I had regarding Mobile AR HMD technology: “Is Tim Cook correct in his assessment that the technology ‘doesn’t exist’ for quality Augmented Reality glasses?”, “Are people ready to adopt Mobile AR HMD technology?”, and “Will people enjoy using AR HMDs on an everyday basis for prolonged durations?”.

To answer these questions I created an AR Application named ‘AlterLight’ for the Mira Prism Mobile AR HMD. The research methods I employed were Experience Prototyping and Evaluative Research. AlterLight is the name of the prototype I used during Experience Prototyping. The value proposition of my Application to a User is entertainment by way of a shared social AR experience.

Hardware and Software

The hardware that I used for my project was a Mira Prism AR HMD, Mira Remote and Image Marker known as a ‘Launch Pad’. The Mira Prism has a 60-degree field of view and a total resolution of 1334 x 750.

Mira Prism Headset

Multiple software applications, libraries and plugins were used to complete my project. The game engine I used was Unity 2017. Some assets, such as the floating island, were acquired from the Unity Asset Store. All object behaviour and game logic was scripted in C#.
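To give a concrete sense of the kind of behaviour scripting this involved, here is a minimal C# sketch of a ball that moves each frame and careens off whatever it hits. The class, field names and values are illustrative assumptions of mine, not code from the actual project.

```csharp
using UnityEngine;

// Hypothetical sketch of a behaviour script for the game ball.
// Names and values are illustrative, not taken from the real project.
public class BallBehaviour : MonoBehaviour
{
    public float speed = 3f;                      // units per second
    private Vector3 direction = Vector3.forward;  // serve toward the opponent

    void Update()
    {
        // Advance the ball along its current direction every frame.
        transform.position += direction * speed * Time.deltaTime;
    }

    void OnCollisionEnter(Collision collision)
    {
        // Careen off paddles, walls and other objects by reflecting the
        // travel direction around the surface normal at the contact point.
        direction = Vector3.Reflect(direction, collision.contacts[0].normal);
    }
}
```

A script like this would sit on a ball GameObject with a Collider and Rigidbody attached so that `OnCollisionEnter` fires.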

The Mira SDK was used to implement Wikitude Computer Vision, the Mira Stereo Camera Rig, Gyroscope-based rotational tracking, Counter-distortion Rendering, and support for the 3DoF Bluetooth controller and the default event/raycast system.

I used the Unity Remote iOS Application for fast iterative development. The Unity Remote is a downloadable app designed to help with Android, iOS and tvOS development.

Unity AR Remote

The app connects with Unity while you are running your project in Play Mode from the editor. The visual output from the editor is sent to the device’s screen, and the live inputs are sent back to the running project in Unity. This allows you to get a good impression of how your game really looks and handles on the target device without the hassle of a full build for each test.

My Unity builds were compiled to an iOS target Application. Xcode was used to embed additional binaries, like WikitudeMiraSDK.framework, and to compile and run various builds on my iPhone 7 target.

The Mira display is purely additive: the brighter a pixel is, the more prominent it will appear on the display. Anything that renders as black on the phone screen becomes the absence of light and will appear transparent.
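In practice, this means the scene should clear to solid black so that empty space vanishes on the lens. Below is a minimal Unity sketch of that idea; it is my own illustration, not necessarily how the Mira SDK configures its camera rig.

```csharp
using UnityEngine;

// Clear the camera to solid black: on an additive display, black pixels
// emit no light and therefore appear fully transparent on the lens.
[RequireComponent(typeof(Camera))]
public class AdditiveDisplaySetup : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = Color.black; // absence of light = transparent
    }
}
```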

For these reasons, I used the Flat Lighting shader created by Bogdan Gochev, along with a 3D asset for the island. The asset is optimized for mobile, and its gradients and bright colours look vivid and clear on the Prism display.

Bogdan Gochev’s Shader

All Raw Image files used for the GUI elements of the 3D UX/UI were designed using Framer’s Vector Editing Utilities. Assets were exported from Framer and then imported into Unity 2017. Framer was a very valuable tool that allowed me to iterate quickly during Experience Prototyping and Evaluative Research activities.

Final Product

While wearing a Mira Prism AR Head Mounted Display, Users play ‘AlterLight’, a game that simulates table tennis in 3D space. AlterLight is a next-generation Marker-Tracked AR Game for iPhone with iOS Spectator Mode enabled.

The Making of AlterLight

Inspiration for AlterLight

There were three points of inspiration for AlterLight. The first was the three-dimensional chess shown in Gene Roddenberry’s Star Trek.

“Spock-you seen the Doritos?”

The second point of inspiration was the arcade classic ‘Breakout’, built by Steve Wozniak with help from Steve Jobs.

The Woz!

The third point of inspiration was Amazon’s space-pong. This is just awesome and really inspired the gameplay.

Amazon is doing some fantastic streaming.

AlterLight is similar to Atari’s ‘Pong’, with a subtle difference: the Player controls an in-game Paddle by moving it along the X, Y and an additional Z axis. Players can compete against either a computer-controlled opponent or another Player controlling a second Paddle on the opposing side. Players use the Paddles to hit a ball back and forth, and the ball careens off various objects in the game environment. The goal is to reach three points before the opponent; points are earned when either the Player or the computer fails to return the ball.
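The scoring rule described above is simple enough to capture in a few lines of plain C#. This is a sketch of the logic only, not the project’s actual code:

```csharp
// Sketch of AlterLight's win condition: first side to three points.
// A point is awarded whenever one side fails to return the ball.
public class MatchScore
{
    public const int PointsToWin = 3;

    public int PlayerPoints { get; private set; }
    public int OpponentPoints { get; private set; }

    // Call when a side misses the ball; the other side scores.
    public void BallMissedBy(bool player)
    {
        if (player) OpponentPoints++; else PlayerPoints++;
    }

    public bool MatchOver =>
        PlayerPoints >= PointsToWin || OpponentPoints >= PointsToWin;
}
```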

To play AlterLight, Users open the Mira-enabled AlterLight app on an iPhone 7, 8 or X and slide the phone into the Prism. The screen faces away from the User and toward a transparent lens that attaches to the head strap magnetically.

Mira uses a rear-projection display technology. The SDK renders a stereoscopic view of your Unity scene first and then it laterally inverts and projects the image onto a magnetic lens that displays your app’s content. The lens has a patented curvature which ensures optical clarity and proportionally rendered polygonal content and media.

Some have criticized the Mira Prism, suggesting that objects look far less solid than those produced by the Microsoft Hololens, and that optical clarity is an observable problem that may adversely impact User experience. Having worked with a Mira Prism for four months, and having worked extensively with the Hololens on various industrial applications, I would be misrepresenting facts if I did not concede that the Hololens offers a noticeable improvement in optical clarity when compared directly with the Prism. However, while the Hololens gives Users very high levels of optical clarity, so does the Prism, and that minor variance does not adversely impact User Experience on the Prism whatsoever. Optical clarity on the Prism is well within an acceptable range for consumers. During Usability Testing and Task Analysis there was no visible error rate that could be attributed, even partially, to issues of optical image clarity on the Mira Prism. Users across multiple segments remarked on how colourful and vibrant the polygonal content was. One User said, “Wow, this is so clear. It’s an entirely new way to see the world.”

Unlike other AR Head Mounted Displays, the Mira Prism is very lightweight and extremely comfortable. I found that this comfort promotes longer session lengths; I have worn my Prism in excess of 2.5 consecutive hours with no observable discomfort.

The AlterLight game environment is fixed to an Image Marker using Marker Tracking and Computer Vision, facilitated by Wikitude image and object tracking. The User can manipulate the environment’s position by rotating or moving the Marker with no interruption in play. This ability to physically move the environment creates some really unique gameplay.
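Conceptually, keeping the environment fixed to the marker amounts to re-anchoring the content root to the tracked marker pose each frame. The sketch below illustrates the idea in Unity terms; ‘markerTransform’ stands in for whatever transform the tracking SDK updates, and is not the Mira or Wikitude API.

```csharp
using UnityEngine;

// Sketch of anchoring game content to a tracked image marker.
// 'markerTransform' is a placeholder for the transform that the
// tracking SDK updates each frame with the marker's detected pose.
public class MarkerAnchor : MonoBehaviour
{
    public Transform markerTransform;  // pose reported by the tracker
    public Transform environmentRoot;  // island, paddles, ball, UI

    void LateUpdate()
    {
        // Follow the marker so a User can physically move or rotate it
        // and the whole environment comes along without a gameplay break.
        environmentRoot.SetPositionAndRotation(
            markerTransform.position, markerTransform.rotation);
    }
}
```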

Why the Name ‘AlterLight’?

I named my project ‘AlterLight’ after being inspired by Donald Glover, also known as ‘Childish Gambino’. Glover, an American actor, producer and rapper, has said that when he had to pick his rap name he used an online Rap Name Generator. There was a great deal of Product Design and Development work to do while I was trying to settle on a title, and I didn’t have much time to invest in a major exploration of Product Branding. I was also reminded of something my Mentors Jeff and Ian told me: “Shipping beats perfection.” As luck would have it, there were several online Video Game Name Generators, and a Generator seemed like a great option given the time considerations. I quickly generated ‘AlterLight’, and by pure coincidence the words ‘Alter’ and ‘Light’ had some domain relevance to the properties of a ‘Prism’, the Application’s target destination. The name seemed perfectly appropriate.

Relationship to Course Content

The teachings from ME/HCI 580 and Dr. Oliver’s assistance have contributed enormously to my ability to successfully undertake and complete this project. ME/HCI 580 provided a very strong foundation in C# scripting. Prior to taking this class, I was an Intermediate Python and iOS Developer with some experience in Unity. The strong foundation I gained in Unity Scripting during ME/HCI 580 gave me the confidence to conduct all of the development necessary for Experience Prototyping.

Experience Prototyping involves exercises completed by design teams to foster a vivid sense of the User’s potential experience. Similar to role-playing, simulation exercises and bodystorming, low-fidelity prototypes or props are used to create a realistic scenario of use and activate the felt experiences of Designers or Users. The method is advantageous for its low cost, and for situations where real-life experiences are prevented by inherent risks, dangers or complicated logistics. Experience Prototyping facilitates active participation in design through subjective engagement with a prototype system, service, product or place; experience prototypes surround a prototype product or service with a simulated physical and/or social context of use.

My Applied Design and Research Process

My design and research process can be characterized as very ‘user’ and ’task-centered’.

The need for a task-centered approach should be obvious: if you build an otherwise great system that doesn’t do what’s needed, it will probably be a failure. But beyond simply “doing what’s needed,” a successful system has to merge smoothly into the user’s existing world and work. A system needs to fit comfortably within a user’s cognitive model.

Understanding the users is critically important. An awareness of the users’ background knowledge will help the designer make key decisions. Less quantifiable differences in users, such as their confidence, their interest in learning new systems, or their commitment to the design’s success, can affect decisions such as how to provide feedback, which 3D interaction techniques to use, and where to place 3DUI elements in the 3D environment.

Phase 1 is Planning, Scoping and Definition, where project parameters are explored and defined.

Phase 2, Exploration, Synthesis and Design Implications, is characterized by immersive research and design ethnography, leading to implications for design.

Phase 3 is Concept Generation and Early Prototype Iteration, involving participatory and generative design activities.

Phase 4 is Evaluation, Refinement and Production, based on iterative testing and feedback.

Phase 5 is Launch and Monitor, the quality assurance testing of design to ensure readiness for market and public use, and ongoing review and analysis to course correct when necessary.

In order to properly evaluate and iterate through ideas, I assumed the roles of Researcher, Designer and Developer. The approach I took was: 1) identify requirements, 2) design, 3) implement, 4) evaluate, 5) repeat. This process allowed me to iterate quickly and explore subsequent Experience Prototypes. I would quickly design a feature, implement it in code, and Experience Prototype the concept with segments of Users that I felt aligned with my target User category. At one point I involved my Wife and Matt Stern, the COO of Mira, in an Analogue Experience Prototyping session. This was a valuable session that helped me obtain useful data and feedback from a domain expert.

Early concept generation

I like to sketch out the game.

Analog Prototyping with Matt and Julia

A good example of my approach to iterative Experience Prototyping was when I relied on User Feedback to change AlterLight’s Player Controls. Early in the game design process I decided that I would use ray-casting to control the Player’s Paddle position.

Note the use of Ray-casting.

During my first Experience Prototype it became apparent that the ray-casting interaction technique resulted in sub-optimal gameplay for Users. Participants in my study indicated that they found ray-casting object selection frustrating. After I identified ray-casting as an irritant for Users, I quickly changed the Player control and interaction methods. Instead of ray-casting, I used left, right, up and down User Input obtained from the Mira remote to control the Paddle’s position in 3D space. My second session of Usability Testing revealed that Users preferred and enjoyed the new Player Controls.
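The revised control scheme can be sketched as direct axis input driving the Paddle’s position, clamped to the playfield. Axis names, bounds and the use of Unity’s generic Input class are illustrative assumptions on my part; the actual build read input from the Mira remote.

```csharp
using UnityEngine;

// Sketch of the revised Paddle control: directional input moves the
// Paddle in 3D space, clamped to the playfield. Axis names and bounds
// are illustrative, not the Mira SDK's actual input API.
public class PaddleController : MonoBehaviour
{
    public float moveSpeed = 2f;                     // units per second
    public Vector2 bounds = new Vector2(1.5f, 1.0f); // playfield half-extents

    void Update()
    {
        float dx = Input.GetAxis("Horizontal"); // left/right on the remote
        float dy = Input.GetAxis("Vertical");   // up/down on the remote

        Vector3 p = transform.localPosition;
        p.x = Mathf.Clamp(p.x + dx * moveSpeed * Time.deltaTime, -bounds.x, bounds.x);
        p.y = Mathf.Clamp(p.y + dy * moveSpeed * Time.deltaTime, -bounds.y, bounds.y);
        transform.localPosition = p;
    }
}
```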

Professor James Oliver’s classes and readings on 3D Interface Design, 3D Interaction Techniques, Cognitive Psychology, Human Factors and general principles of Human Computer Interaction provided a strong foundation of knowledge that enabled me to Design and Prototype my AR HMD experience.

AR image tracking, computer vision and interaction were very important subjects that I had to study extensively to complete this project. ME/HCI 580 provided a very strong introduction to these subjects with a variety of case-studies that enabled me to identify frameworks and libraries that would add value to my project.

I attended Rafael Radkowski’s guest lecture, which provided very useful insights about Markerless AR, going into great depth on Feature Points, Point Clouds and Hit-testing. Hit-testing is great if you want to accurately align virtual objects to the real world. However, the Prism’s rear-projection rendering pipeline, described earlier, does not allow capture of sensor data from the iPhone 7’s front-facing camera, so at the present time I could not implement Markerless Tracking.

Summary

This has been an incredibly insightful journey and a defining learning experience that I will never forget. In my Project Objective statement, I enumerated several questions for which I now believe I have answers.

The first question I had was,

“Is Tim Cook correct in his assessment that the technology ‘doesn’t exist’ for quality Augmented Reality glasses?”

Before I answer this question, I would like to be unambiguously clear that my research does not constitute a criticism of Mr. Cook; my work is a thoughtful evaluation of his statement. I am a very big fan of Mr. Cook’s leadership. He is an AR Evangelist and Subject Matter Expert of the highest order, and his lecture at the Utah Tech Tour was mind-expanding.

“Is Tim Cook correct in his assessment that the technology ‘doesn’t exist’ for quality Augmented Reality glasses?”

To answer that question, I think we need to expand on what defines a ‘quality’ AR experience. This is a subjective point of evaluation that varies from person to person. What constitutes a quality experience to Apple? To answer this, we may want to explore Apple’s design standards.

Apple’s Design Practice is led by a visionary and personal hero of mine named Jonathan Ive. Mr. Ive’s designs are incredibly sensitive to the needs and goals of the User. Ive’s aesthetic is effortless, timeless, empathetic, austere and distinctly reminiscent of Bauhaus Design. There is no compromise between aesthetic and utility in Ive’s work — it’s a harmonious expression of function and beauty that I have been studying for over a decade. Jonathan’s large body of work suggests a deep understanding of Human Factors and User-centered Design.

Fun fact: I once worked with a colleague who did his internship at Apple and worked on the late-2013 Mac Pro. He told me that Jonathan’s team went through countless iterations to identify the perfect finish for the computer’s enclosure. In 2012, Ive was made a Knight Commander of the Order of the British Empire (KBE) at Buckingham Palace for “services to design and enterprise”. One can reasonably conclude that Apple’s Design Leadership has exceptionally high standards that prioritize both the utility and the aesthetic of a product.

With these considerations in mind, I do not think Apple would release an Augmented Reality Head Mounted Display any larger than a pair of optical glasses. I would wager that Apple would want a very compact form and design so as not to impede utility or ease of use. Moreover, I would imagine that Apple would have serious expectations for ‘iGlass’ computing: it would likely need to allow Users to check email, control music, display biometric data and much more. The only technology I have found that would enable this functionality is an advanced AR display technology called Light Field Displays.

This technology is still in its infancy and is nowhere near the compact scale Apple would likely require to commercialize it and create ‘quality’ value for its Users.

For Tim Cook, ‘quality Augmented Reality glasses’ would likely be no larger than a nice pair of Armani eyeglasses. They would likely be water resistant and free of cables, with a highly advanced and compact Light Field Display capable of providing contextual notifications and wayfinding services to Users.

I have no doubt that under the Design Leadership of Jonathan Ive, Apple will realize a feat similar to the one articulated above sooner than we think. However, I have different expectations for a pair of ‘quality AR glasses’. A quality AR HMD, by my definition, needs to be affordable, comfortable and easy to use, with delightful and vibrant optical clarity. I want to be able to wear an AR HMD with my Wife and enjoy a socially connected experience like AlterLight or Netflix over a glass of wine. No ‘Hands Up Display’! I want the design of a Mobile AR HMD to be mindful of the social context of use and my need to feel ‘cool’ while wearing it.

The question remains, “Is Tim Cook correct in his assessment that the technology ‘doesn’t exist’ for quality Augmented Reality glasses?”. I would suggest the answer depends on which definition of ‘quality’ you subscribe to and your specific needs as a User. The Mira Prism is the first Mobile AR HMD to properly respond to all of my needs at an affordable price of approximately $150.00 USD.

The remaining questions my project tried to answer were “Are people ready to adopt Mobile AR HMD Technology?” and “Will people enjoy using Mobile AR HMDs on an everyday basis for prolonged durations?”. Fully addressing these questions would surely require additional research that would exceed the scope of this project, however, my Evaluative Research and Experience Prototyping has helped me form several conclusions.

Data obtained during Usability Testing and Task Analysis indicates that Users can comfortably wear a Mira Prism in excess of 2.5 hours. I suspect session lengths could exceed 3–4 hours depending on the social context of use. User goals and tasks were achieved easily, with visible indications of delight. Interactions with the Mira Prism appeared to be intuitive for most Users, and the Prism fit comfortably within Users’ mental models. At no point did I observe anything to suggest Users felt ‘silly’ wearing a Prism. Three out of five subjects enquired at the end of Usability Testing where they could buy a Mira Prism. Based on this data, I would conclude that consumers would likely adopt AR HMDs such as the Mira Prism and enjoy using them for long durations.

Reflections

I am very happy with the net outcome of the Tanoshii Project. I believe that I accomplished my research objectives and found answers to my questions. I am particularly happy with how the game environment incorporates Marker-based tracking and Spectator Mode functionality. I did not have enough time to implement iOS multi-peer connectivity as I had intended; were I to do this project again, I would allow more time to explore it. Finally, I would use 3DUI elements as opposed to 2D Canvas Image objects. This may seem an obvious and self-evident point; however, designing interfaces with greater sensitivity to UI visibility in 3D space is an opportunity I would like to explore further.

Bibliography

Boland, Mike. “Roughly 380 Million iPhones are ARKit-Compatible.” UploadVR. August 8, 2017. Accessed December 1, 2017. https://uploadvr.com/380-million-iphones-are-arkit-compatible/.

Kharpal, Arjun. “Google says augmented reality will be on ‘hundreds of millions’ of Android devices next year.” CNBC. November 7, 2017. Accessed December 1, 2017. https://www.cnbc.com/2017/11/07/google-augmented-reality-will-be-on-hundreds-of-millions-of-android-devices.html.

Edwards, Benj. “Unraveling The Enigma Of Nintendo’s Virtual Boy, 20 Years Later.” Fast Company. August 21, 2015. Accessed December 01, 2017. www.fastcompany.com/3050016/unraveling-the-enigma-of-nintendos-virtual-boy-20-years-later.

Gochev, Bogdan. “Flat Lighting Shader.” September 26, 2017. Accessed November 1, 2017. https://www.assetstore.unity3d.com/en/#!/content/67730.

Google Inc. “Liquid Ping Pong in Space — RED 4K.” January 21, 2016. Accessed December 1, 2017. https://www.youtube.com/watch?v=TLbhrMCM4_0.

Kastrenakes, Jacob. “Tim Cook says the tech ‘doesn’t exist’ for Apple to make good augmented reality glasses.” The Verge. October 11, 2017. Accessed November 11, 2017. https://www.theverge.com/2017/10/11/16458944/apple-ar-glasses-tech-doesnt-exist-says-tim-cook.

Martin, Bella, and Bruce Hanington. Universal Methods of Design. Beverly, MA: Rockport Publishers, 2012.

Matney, Lucas. “An afternoon with Avegant’s prototype light field display headset.” Tech Crunch. April 7, 2017. Accessed December 1, 2017. https://techcrunch.com/2017/04/07/an-afternoon-with-avegants-prototype-light-field-display-headset/.

Meta Company. “Meta 2.” Buy the Meta 2 Augmented Reality Dev Kit. February 17, 2016. Accessed December 1, 2017. https://meta-canada.myshopify.com.

Microsoft Corp. “Microsoft Hololens.” March 30, 2016. Accessed December 1, 2017. https://www.microsoft.com/en-ca/hololens/buy.

Miesnieks, Matt. “The product design challenges of AR on smartphones.” TechCrunch. September 02, 2017. Accessed December 01, 2017. https://techcrunch.com/2017/09/02/the-product-design-challenges-of-ar-on-smartphones/.

Mira Labs Inc. “Mira 101: Prepare for Launch.” Mira Reality. July 18, 2017. Accessed November 28, 2017. https://developer.mirareality.com/docs/mira-101-1.

Mira Labs, Inc. “Start building the future, today.” Mira Prism Augmented Reality Headset. July 18, 2017. Accessed November 21, 2017. https://www.mirareality.com/developers.

Moynihan, Tim. “IT’S A GOOD THING THE F-35’S $400K HELMET IS STUPID COOL.” Wired. June 10, 2016. Accessed December 01, 2017. https://www.wired.com/2016/06/course-f-35-comes-400000-augmented-reality-helmet/.

Silva, Fi. “Low Poly Floating Islands.” December 12, 2016. Accessed December 1, 2017. https://www.assetstore.unity3d.com/en/#!/content/76732.

Unity Technologies. “Unity Remote.” Unity Remote. August 12, 2017. Accessed December 1, 2017. https://docs.unity3d.com/Manual/UnityRemote5.html.

Wikipedia. “Universal design.” Universal design. October 23, 2017. Accessed December 1, 2017. https://en.wikipedia.org/wiki/Universal_design.

Wikipedia. “Gunpei Yokoi.” Gunpei Yokoi — Wikipedia. November 07, 2017. Accessed December 01, 2017. https://en.wikipedia.org/wiki/Gunpei_Yokoi.

Wikipedia. “Three-dimensional chess.” Three-dimensional chess. April 1, 2017. Accessed December 1, 2017. https://en.wikipedia.org/wiki/Three-dimensional_chess.

Wikipedia. “Breakout (video game).” Breakout (video game). November 1, 2017. Accessed December 1, 2017. https://en.wikipedia.org/wiki/Breakout_(video_game).

Wikipedia. “Pong.” Pong. December 23, 2017. Accessed December 1, 2017. https://en.wikipedia.org/wiki/Pong.

Wikipedia. “Jonathan Ive.” November 9, 2017. Accessed December 1, 2017. https://en.wikipedia.org/wiki/Jonathan_Ive.


Adam Maz

Ph.D. Student in Computer Science, studying Artificial Intelligence and Robotics.