If we can guide a rocket through the air, through the variable forces of wind, into the sky, past the atmosphere, into Earth’s orbit — accounting for all the forces of nature that could potentially obstruct us from doing so — surely we can propel humanity towards a stance of more day-to-day thoughtfulness and perspective. There are variables, but we are closer than ever before to understanding and accounting for these variables, using machines that mediate our every move and every word.

What inspired me today was watching the SpaceX team celebrate the Falcon 9’s landing on a droneship. To see the mission control station — hundreds of people jumping up and down, hugging each other, ecstatic — for collective human achievement, for accomplishing what once seemed an insurmountable task — this is the spirit of the future, and the spirit required to guide one of the most mercurial ships — the human. …


Gustave Caillebotte — Paris Street; Rainy Day (1877)

What makes up a moment?

Nothing happened, but the image lives in my mind, vividly. I don’t remember the date, or occasion — it was a small and simple millisecond in which my mind was quieted by the beauty of it all.

I was returning home around 10pm after a long day at class. Turning onto Ridgeway from Chicago’s bustling Belmont Ave, I started down the street, now only a block from home. Under dim streetlights, I gazed down a quiet residential avenue of orange sidewalks and shadowed stoops, brick walls cast yellow by the same illumination; a light breeze rustled my sweater on the otherwise lukewarm August evening, affirming my choice to put on the extra layer; leaves and branches rubbed shoulders above, a chorus of grassy whispers; a wash of passing cars, ebbing and flowing, receded into the background behind me; my shoes scraped lightly against the textured concrete. Somehow all these sensations melded together into a moment of pure contentment. …


Developing a VR environment for Google Cardboard in Unity 5

This weekend marked my foray into the world of VR design. Working in Unity 5, I scripted up some basic movement controls, and used the CardboardSDK and the Unity Remote iOS app to stream the Unity output to an iPhone mounted in Google Cardboard. Here are a few takeaways from my first virtual build.
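
To give a sense of what those movement controls looked like, here is a minimal sketch: gaze-directed walking, triggered by holding the Cardboard button (which Unity registers as a screen touch). The class name, field names, and speed value here are illustrative placeholders, not my exact script.

using UnityEngine;

// Gaze-directed movement for Cardboard: hold the trigger (seen by Unity
// as a screen touch / mouse button) to walk in the direction you're looking.
public class GazeMovement : MonoBehaviour
{
    public Transform head;      // the Cardboard camera's transform
    public float speed = 1.5f;  // units (~meters) per second

    void Update()
    {
        if (Input.GetMouseButton(0))
        {
            // Flatten the gaze vector so looking up or down doesn't make us fly.
            Vector3 forward = head.forward;
            forward.y = 0f;
            transform.position += forward.normalized * speed * Time.deltaTime;
        }
    }
}

Attach something like this to the player object that parents the camera, drag the camera into the head slot, and you have a serviceable first-pass locomotion scheme.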

Scale

In Unity, I felt ready to jump in and start building the world — constructing walls, adding objects, applying textures. As it turned out, I spent a good amount of time at the outset working to achieve what felt like a lifelike scale for my environment.

Forums tell me that Unity’s in-editor units are meant to roughly resemble the metric system (i.e. 1 unit ~ 1 m). Establishing the height of my main camera was the first step here. I opted for 1.5 units, as I found the feeling of being slightly “dwarfed” more exciting and magical than the feeling of being slightly “big” for the environment. …
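
In script form, that setup amounts to something like the following (a sketch under the 1 unit ~ 1 m assumption; the component and field names are hypothetical):

using UnityEngine;

// Places the camera rig at eye height above the floor (y = 0).
// With Unity units read as meters, 1.5 sits a touch below typical adult
// eye height, which produces that slightly "dwarfed" feeling.
public class CameraHeightSetup : MonoBehaviour
{
    public float eyeHeight = 1.5f; // in units (~meters)

    void Start()
    {
        Vector3 p = transform.position;
        transform.position = new Vector3(p.x, eyeHeight, p.z);
    }
}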


I often get asked by friends why I’ve taken such an interest in AR. After all, many of them know me as a Luddite: I got my first smartphone in 2013; I’ve removed myself from Facebook, Instagram, and Twitter for almost two years now; and I’ve gone months at a time living “disconnected” at home. In short, I’ve gone to great lengths to limit the role of tech in my life. Why, then, am I now so focused on designing and developing for this new platform?

There are several reasons, but a core principle is that AR will be the platform that allows the tech in our lives to “fade into the background.” I recently read “The Coming Age of Calm Technology,” a short article published in 1996 by Xerox PARC researchers Mark Weiser and John Seely Brown. Weiser and Brown provide a brief history of human-computer relations — beginning with the advent of mainframe computers, and eventually projecting ahead to an era of “ubiquitous computing.” …


The concept of “affordance” becomes especially relevant in an augmented reality context. Industry and design have developed a series of affordance precedents in web / screen UI, centered primarily around color, buttons, and cursor styles. In our excitement to generate demonstrable, interactive AR interfaces, we should be cautious of the temptation to copy these affordances directly from web and screen design.

Instead, we should return to an affordance theory that developed before digital design was prevalent, one that accounts for the chaos of environmental factors.

When, in evolutionary history, did affordances first become relevant? In looking at a banana, for example, one has an impulse to peel it open. When considering the development of primitive tools (admittedly not without thinking of Stanley Kubrick’s Dawn of Man), it’s worth asking what attributes and cognitive processes lead us to view a tactile, 3D object as usable. Does consideration of the task precede the recognition of an affordance (i.e. do I only see a bone as a weapon if I’m about to fight)? Or does something about the form of the tool evoke a neural association with a previously performed action, leading to an idea for future use (i.e. as I pick through a pile of bones, something strikes me about the shape of one, and how it fits into my hand, and I come to an eventual realization that this could be used to strike and hurt another animal)? …


Skeu

DESIGNATION started this week — it’s an 18-week UI/UX bootcamp. I’ll be doing the online portion from Massachusetts for the first 8 weeks, but the bulk of the program will be on-site in Chicago starting in April.

Getting back to Chi — couldn’t be happier. Having lived between Boston and New York over the past two years, I still think Chicago has this untouchable charm. It’s young, it’s willing to experiment; it’s confident, but not arrogant; and despite the weather and the violence, it somehow feels steady and hospitable at the same time.

DESIGNATION won’t teach anything specifically about augmented reality design, but I’m hoping that some of the core UI/UX principles will be applicable to next-wave platforms, especially since UX/UI will be even more important in AR than it is in a screen-centered world. …


Oceans

I got to sit down and speak with Pattie Maes this past Thursday.

Dr. Maes is legendary in the field of HCI. She’s currently one of the academic heads of the future-factory known as the MIT Media Lab, where she leads the Fluid Interfaces Group and co-coordinates MIT’s Advancing Wellbeing initiative. “Fluid Interfaces” is a research/design group based on the premise that we’re moving towards a post-screen digital world, and that there is an abundance of opportunities to redefine interaction between the real and digital worlds.

She’s best known publicly for her TED talk on the “SixthSense” prototype — a wearable AR device that projects contextually-relevant digital information onto physical objects (she was demonstrating this in 2009 — WAY before Google Glass, Microsoft HoloLens, or Magic Leap). …

About

Aaron Faucher

UX/UI for mixed reality and positive computing. http://afaucher.me
