HoloLens: Microsoft is Like That Weird Uncle That Always Had Cool Shit

Noah Norman
DAYONE — A new perspective.
15 min read · Jan 14, 2016

Today I got a chance to check out Microsoft’s HoloLens augmented reality headset at a developer demo a few floors above their flagship NYC store. I received an invite to try the thing out because I have a Microsoft developer account. I have a Microsoft developer account because I’ve done some creative work with the Kinect.

I posted another piece on Medium recently about the problems ahead for AR and have since been giving the topic a lot of thought. I’ve got a few more AR posts in the queue.

This post is as much about that developer demo event as it is about the hardware and the content Microsoft chose to present. I’ll justify as we go along.

There appears to be a segment of the Microsoft hardware dev team that has wizard-like powers. The Kinect 2 is excellent at what it does, and it does so at a price point (~$150 + a $50 Windows adapter) that betrays its loss-leader position in their Xbox ecosystem and has made it an indispensable tool for certain kinds of interactive art.

As you’ll read below, the HoloLens is a more-than-adequate seminal entry into the world of high-fidelity (hi-fi in contrast to the low-fi of Google Glass) augmented reality. Some smart decisions were made with the interface elements in the demos presented, and the hardware, as mentioned above, must be the creation of some kind of warlock, or shaman, or kahuna, or witch doctor, truly. Some strong medicine.

With all that said, it seems that Microsoft just can’t get out of their own way. The experience of attending this demo was cripplingly awkward, and I say this as someone who has just come back from CES, arguably the home of inconsistent eye contact and sweaty handshakes. That stuff is going to be a part of this post too, not just because it’s entertaining, but because it sheds some light on Microsoft’s marketing angle with the thing. That’s gonna matter when AR meets the public.

AR has some dorky optics to begin with. You look like an asshole using it — an issue the industry will have to overcome if AR is going to make it in the wild. If Microsoft is going to lead the second charge to bring it to the public (and they could, with their reach and this amazing untethered device), I’m afraid all of AR will be seen through their dweeby lens.

There are two puns in that paragraph.

Here’s the TL;DR:

HoloLens: AR’s answer to the Rift DK1. A proof of concept that makes augmented reality’s potential feel really real for the first time.

Microsoft’s presentation: belongs under this silo.

Having arrived just before my scheduled time of 2pm, I was directed to the second floor, checked in on a Surface, and told to hold tight as I’d be taken up at 2 on the button. And so it was at 2pm sharp — into the Wonkavator, and I was whisked into a world of cubicle partitions, awkward silences, and some solid demos for a piece of hardware whose design, I came to realize, could define the very language of AR for years to come.

The experience began with myself and another developer seated in a small room about four scant feet from a looming 100-plus-inch TV. Our earnest host kicked things off with a video explaining basic use of the HoloLens. The content did not require a jumbotron, by any means, and, to a degree, the panoramic how-to set the HoloLens’s own field of view up to pale in comparison.

‘Gaze • Gesture • Voice’, it explained, are how you interact with the device, and this is how you fit it to your head, or if you have a ponytail, or bangs, or glasses, this is how — the film was thorough in that regard.

Now here my heart sank a bit as the narrator revealed that we’d be using our eyes to move a circular reticle around, and twiddly-dinking our fingers in the air before us to ‘click’ on interface items highlighted by that reticle. ‘Mousing?’ I thought. ‘Mousing. In the future. We are taking the mouse with us.’

But fear not, as I did — the future is here — it’s just unevenly distributed. It’s not as bad as all that, and doesn’t one want something from our digital past? A soft landing, if you will.

When the exposition wrapped up, our host used a device I initially mistook for a Super Nintendo to measure my IPD, or interpupillary distance. I’m a 61.5, which puts me somewhere around the 24th percentile. Is that good? I think it’s pretty OK—I’ve never had any complaints. He then wrote it down for me, with a real pen on real paper, which in retrospect seems kind of quaint considering the circumstances. This was my golden ticket, if you’ll allow me to flog the Wonka metaphor.

My IPD — I hope publishing this information doesn’t come back to bite me somehow.

Debriefed, IPD in hand, I was taken down a stretched-fabric cubicle farm hallway. Here I was asked to halt in a tight space dotted with digital prints featuring concept art of handsome, HoloLens-clad men. They were looking at exploded diagrams of phantasmagoric turbines— see-through spectral models that were floating, it seemed, inches above their desktops.

The sliding door before me was closed. I inferred my assigned meeting room was taken, and so I stood with my host in what was to be the second of many prolonged and awkward silences. It was eerily quiet, I now realized. I wondered briefly if this was an elaborate ruse to make off with my IPD and perhaps one of my kidneys.

After a few abortive attempts at making small talk with my guide, I found that to take the staff off-script was to throw a monkeywrench into the Pirates of the Caribbean ride that was this demo. I opened my mouth as if to speak and the man’s face fell before I made a sound. Promises were made that someone later would be able to speak extemporaneously, but, it was implied, now was not that time. Now was the time for shoegazing. I dutifully allowed my eyes to unfocus as I pretended to contemplate the wall coverings. I sniffed aloud to mark the time.

DEMO ONE and FIRST REACTIONS

After a few more minutes of crushing silence, I was invited into a small space for the first of three hands-on demos. There was grey commercial carpet on the floor, more prints — these of crop circles, a stool, a desk, a woman who didn’t introduce herself sitting at that desk, and a cloth-covered table at the middle of the far wall.

The docent who did greet me did so verbatim on-script, a fact I know for certain because nobody speaks like that in private to another person, ever. Think painfully-scripted casual like this HoloLens Minecraft demo from last year’s E3.

She explained that this first demo was meant to showcase HoloLens’ utility in the context of a product design presentation. I half-listened as I tried to decide if I should be making eye contact or not — it felt weird to meet her gaze as she delivered her speech, almost like staring into the dentist’s eyes as she digs at your gumline. This was the uncanny valley. It was like Sleep No More, but uptown, and less sexy.

With the headset fit as I was shown to do in the panoramic introduction session, under my ponytail, bangs swept aside (JK I don’t have bangs), my first glimpse of the future commenced.

First reactions: the active area of the display is small relative to your roughly 114-degree FOV. It looks something like a 32" monitor about 3' from your face.* While there are more than a few challenges to representing AR and VR experiences with traditional media, the first representation at MS’s E3 HoloLens + Minecraft demo event is pretty faithful to the perceived scale:

A minimally problematic representation of the HoloLens experience as the user sees it. Source.

*Update: a more firsthand-y source has corroborated my eyeballing, saying it’s the equivalent of a 15" monitor about 1.5' away.
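For the curious, either eyeball estimate converts to an angular field of view with one line of trigonometry. Here’s a quick back-of-the-envelope check — assuming 16:9 panels measured on the diagonal, which is my assumption, not anyone’s spec:

```python
import math

def horizontal_fov_deg(diagonal_in, distance_in, aspect=(16, 9)):
    """Approximate horizontal FOV subtended by a monitor of the given
    diagonal size, viewed head-on at the given distance (both inches)."""
    w, h = aspect
    width = diagonal_in * w / math.hypot(w, h)  # screen width from diagonal
    return math.degrees(2 * math.atan((width / 2) / distance_in))

# My eyeballed estimate: a 32" monitor about 3' (36") away.
print(round(horizontal_fov_deg(32, 36), 1))  # → 42.3
# The corroborated figure: 15" at about 1.5' (18") away.
print(round(horizontal_fov_deg(15, 18), 1))  # → 39.9
```

Both rough guesses land around 40 degrees of horizontal arc, so at least they agree with each other — a small sliver of your ~114-degree binocular view either way.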

Saturation and color gamut are just OK — the image looks a bit muted, but with decent blacks and good white intensity. Opacity is incomplete but appropriate for most AR. The ‘screen’ appears ‘flat’, and that’s because it’s really curved just so as to appear that way.

The headset’s weight rests on a firm but adjustable hoop—one meant to clench your head like a civic crown—a hoop from which the rest of the apparatus can pivot freely without making contact with your nose bridge or anywhere else. It works, and it’s nice for once to experience futuristic facewear that doesn’t lean hard on my beak or squeeze my brain unduly.

After a moment of soupy and inertial auto-calibration, the holo-image floated into view. It was in focus and crisp (I had given the nice lady my IPD, in what would be our only unscripted interaction), but immediately I noticed the fluorescent light behind me was causing internal reflection off the display surface in a myriad of colors. This drew my attention to the birefringent sub-reflection of the display’s image, something like the effect of an oily, horizontal mirror just below the viewable area.

These ‘I can see behind me’ kind of reflections are not unique to AR glasses — they’re more or less inherent to any glasses that have to sit off your face by any amount. You’ve probably seen them in the sides of your RealD glasses before they dim the lights on a 3D movie. I see them all the time but I have a freakishly narrow head and I see dead people as well. YMMV.

As for the inverted birefringence, I don’t know what’s going on there because I have a degree in music. It’s distracting though.

After an interval, the MS logo I was looking at was replaced by an exploded diagram of a man’s wristwatch, writ fairly large at about 3' diameter, seemingly splayed out over the cloth-covered table across the cubicle from me. A voice piped up from behind my ears to explain what I was seeing.

The voice told me to walk around the room, this to underscore the principal thing that makes HoloLens different from most other display tech — it’s holographic.

That’s a statement that might get me into a low-stakes brawl in certain circles, as even the Wikipedia entry on holography seems ambivalent about HoloLens’ claim to such. Normally, I’m more than happy to split that particular hair, loudly, even in the face of people experiencing joy at Tupac’s seeming resurrection, but in this case I’m not going to quibble — you can walk around the things, they have volume, can exhibit parallax and perspective, the whole deal—it’s close enough.

This is a picture of a real hologram. Kind of boring, right? Anyway, there are two kinds of pictures of holograms — lies and damn lies. This is probably just a candid of a mouse through BluBlockers. // Source: Wikipedia.

With the model live and poking out into the room, I noticed that the near clipping plane on the renderer is about arm’s length from your face — maybe a little less than 3' away. This is probably rooted in the pains of convergence arriving when you least expect it (I’m looking at you, early modern 3D movies), but, as I mentioned, I never finished that eye surgery correspondence course, so what do I know?

Not being able to get really close or get that ‘damn I better duck’ feeling makes the spatial immersion come up a little short, even compared to 1986’s Coppola / Jackson / Disney classic, Captain EO, at which I most definitely ducked.

Sound on the HoloLens emanates from a pair of small and not particularly capable speakers embedded in the body just above your ears, where they downfire through a pair of grilled ports. Sound quality aside, what’s immediately salient is how much the binaural processing and 3D spatialization of the sound adds to the immersion of the experience. The voice beckoned me to put my ear by the watch, that I might hear its ticking. And I did. And it was good.

Headphones might do a better job of bringing you immersive, high-fidelity sound reproduction, but that’s the immersion-above-all-else thinking of virtual reality. This is augmented reality, in which reality remains important, albeit not necessarily equally so, depending on the application. You have to be able to hear both the there and the not-there, and so open-ear headphones or speakers are the Right choice.

The rest of this demo took me through a few clunky MS-ribbon-y UX examples that one could use to annotate an existing exploded diagram or something like that.

My main takeaway from this bit was that ‘air tapping’ was pretty reliable, intuitive, and, after your first few tries, not all that embarrassing to do. Anyway, I kind of tuned out as much as you could within your first 90 seconds of experiencing a revolutionary technology.

But then the lady told me to look to my right, and damned if it wasn’t Slam Man standing over there with a nice tight laser beam coming out of his eyes, just boring right into that big old watch.

Artist’s rendering. I didn’t get a good look at the guy but this is more or less what was happening over there.

This was meant to represent what you might see when looking at another HoloLens user in a certain context — you can actually see one another’s gaze represented as a sort of AR laser pointer. What’s more, you can visualize your model with a false-color heat map texture that represents other users’ gaze dwell time. I’m no marketer but I can see the value in that for any number of applications (industrial design, architecture, construction, collaborative modeling, hell, even the ‘arts’). My gears turned. Watch pun.
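To make the dwell-time idea concrete, here’s a toy sketch of how gaze samples might be binned into a false-color map. The grid resolution, sample format, and blue-to-red ramp are all my own stand-ins — I have no idea how Microsoft actually implements it:

```python
from collections import defaultdict

def accumulate_dwell(gaze_samples, dt, grid=32):
    """Bucket gaze hit points (u, v in [0, 1) surface coordinates)
    into a grid, accumulating dt seconds of dwell per sample."""
    heat = defaultdict(float)
    for u, v in gaze_samples:
        cell = (int(u * grid), int(v * grid))
        heat[cell] += dt
    return heat

def to_false_color(heat):
    """Normalize dwell times and map them onto a simple blue-to-red ramp."""
    peak = max(heat.values())
    return {cell: (t / peak, 0.0, 1.0 - t / peak)  # (r, g, b)
            for cell, t in heat.items()}

# Users staring mostly at one spot on the watch model, sampled at 30 Hz:
samples = [(0.50, 0.50)] * 8 + [(0.10, 0.90)] * 2
colors = to_false_color(accumulate_dwell(samples, dt=1 / 30))
```

The hot cell comes out pure red and the glanced-at one mostly blue; in the real thing this ramp would presumably be baked into a texture on the model itself.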

Incidentally, my brief experience with Slam Man made me realize that AR displays with a narrow field of view incline you to ‘check out’ newcomers in a comically stage-broad way.

DEMO TWO: MAKING IT DO WHAT IT DOO-DOO

After the watch demo and a quick bit about the solar system (the point there being that the stars went on for ever), the nice robot lady took the headset back and said her pre-programmed goodbye. My guide from the hallway then took me to the adjacent room, where I met another nice young person who probably has a great personality that they are never, ever to show to a customer under any circumstances.

This demo was where they really sell it. Here, using your voice, you tell the device to initiate a ‘spatial scan’ (the other kind of SLAM, or perhaps more accurately, a ‘surface map’), and then proceed to watch that trite computer vision effect — the one with the grid swooping over everything in the room — happen IN THE REAL WORLD as you slowly turn around in a circle.

The grid in question is more of a Voronoi / Delaunay / subdivision surface thing than a grid, but you get the idea. The SLAM (can we please call it a SLAM?) saw everything — the bookcase, the couch, the stool, the walls, the floor, even my guide where he was seated in the corner. My inner McMahon said ‘HIYO!’ Out loud, and involuntarily, I made a dumb sound.

This was the setup for a series of UX demos showing how to place, texture (‘spraypaint’), transform, rotate, clone, and delete objects in a 3D scene.

They began by showing me how to move a 1:1 model of some kind of virtual billiards sign (why that? why not) around the various surfaces of the room. I plopped it on one of the couch pillows, at which point my guide explained that this demo was to show how useful the system is for understanding the scale of models—models you might 3D print, and thereupon he pointed out that that same billiards sign was right there in meatspace, at the same scale, on the bookcase in front of me. Fair enough.

The next of each of these 3D-modeling-lite demos came with an intuitive voice command or a corresponding palette item from a virtual toolbox I had been instructed to slap on the wall behind me. That sentence kind of reads crazy but in AR, crazy reads you.

Most of the interactions were driven by a combination of terse voice commands, reticle-pointing, and air-clicking, making the operations feel immediate and intuitive, albeit not so much new as familiar. Baby steps.
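The gaze-plus-air-tap model is simple to sketch: the device casts a ray out along your head pose, the reticle sticks to whatever that ray hits first, and the tap ‘clicks’ it. Here’s a toy version — the sphere colliders and the scene are my own stand-ins, not anything from the actual SDK:

```python
import math

def gaze_target(head_pos, gaze_dir, objects):
    """Return the name of the nearest object hit by the gaze ray, or None.
    Objects are (name, center, radius) spheres; gaze_dir is a unit vector."""
    best = None
    for name, center, radius in objects:
        # Project the vector to the object's center onto the gaze ray.
        to_obj = [c - p for c, p in zip(center, head_pos)]
        along = sum(a * b for a, b in zip(to_obj, gaze_dir))
        if along <= 0:
            continue  # object is behind the viewer
        # How far the ray misses the center at its closest approach.
        closest = [p + along * d for p, d in zip(head_pos, gaze_dir)]
        miss = math.dist(closest, center)
        if miss <= radius and (best is None or along < best[1]):
            best = (name, along)
    return best[0] if best else None

# Looking straight down the z-axis: the reticle lands on the watch,
# and an air-tap would then 'click' it.
objects = [("watch", (0.0, 0.0, 2.0), 0.5), ("toolbox", (2.0, 0.0, 2.0), 0.5)]
print(gaze_target((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), objects))  # → watch
```

Which is to say: it really is mousing, with your neck as the mouse and your index finger as the button.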

Notably, there was a command for ‘life size’, which blew up the little tropical scuba scene I was mucking about in to full-scale, and then toggled it back to tabletop size. I had made that parrotfish a little big, as it turns out, but the point was taken—in AR, ‘life size’ means life size, and if scale matters, that experience is incomparable with the drinking-straw perspective you get from the tools available to you now.

Altogether, though, this was less Photoshop than MS Paint, if you catch my drift. I wasn’t ‘modeling’, per se, so much as manipulating a few pre-existing objects, but with some simple tools a la Sketchup or 123D Sculpt, I could have been cooking with gas quick.

It works, and if you work in 3D, especially 3D where scale is important, this could represent a significant shift in the way you experience your product before the (time consuming, sometimes expensive, often irrevocable) transition from digital to physical.

DEMO THREE: PEW PEW

Last up on the demo circuit, in a third and final cubicle, a nice lady told me in a rote but animated way that I was about to play the FPS they call Project X-Ray. It’s this one — you’ve probably seen the stage demo from an early HoloLens press event.

The game uses the Xbox controller, which felt comfortable, if not as novel as what’s going down in this Magic Leap concept video that we all took with a serious grain of salt. The short of it is: a bunch of robots bust through the (real) walls around you and you shoot them, dodge their slow-moving blaster blobs, and occasionally use bullet-time and x-ray vision as a superpower.

It was like this but I was better dressed. And they didn’t let me behind the couch. And I don’t remember seeing a blaster on my hand. And I was holding an Xbox controller. Source.

It was really fun, although I was too cool to let on in the moment, and, in this case, the narrow FOV was designed into the experience, making the player rely on the strong spatial sound and the danger indicator near the aiming reticle. Backed by some good play balance, the tiny view served to heighten the drama.

I finished with a score of about 13,000, and the docent gamely delivered her scripted line about how well I did.

THE FOLLOWUP. THE HARD SELL. THE COLD SHOULDER.

With the demos behind me, I was excited to sit down with some heads from the MS dev team and talk shop about how this thing can be used. What are its limitations, I wondered, and how can they be made into strengths? What features are available in the SDK as it is now and what is the roadmap going forward? What has the development experience been like so far?

At that point, the guide took me into a small conference room and introduced me to a Surface tablet, where he said I could fill out a survey and/or look at the HoloLens website to make a pre-order. The survey asked me how likely I am to purchase a device or tell a friend. I was noncommittal on both (6/10), and I declined to peruse the website at that time.

There were no experts, no other developers, no people free to speak their mind and share a human experience — to bring me into the fold— to convert me — right at this key moment, when I was about as amped as I’d ever been on a Microsoft product. This was the softest sell of all time.

Survey completed and back out into the cube farm with my stalwart guide, I was ushered in silence back to the entry for my last bit of human non-contact, standing quite still as we avoided one another’s eyes and waited for the elevator to take me back to reality.

As an introduction to augmented reality, you couldn’t ask for much more than the HoloLens hardware. It works just well enough to give you the feeling that something akin to magic is afoot.

As a handful of tech and UX demos, the experiences Microsoft are showing off succeed in telling a few different stories, offering a soft landing, and hinting at a world of possibility behind the curtain.

As a marketing experience, albeit one geared towards their own cadre of developers, this made me hope that Microsoft gets some outside help when it comes time to introduce HoloLens, and, by extension, augmented reality, to the world, because right now it has all the style of a tax return.

Copyright Noah Norman, 2016. Originally published at hardwork.party on January 14, 2016. Follow Noah on twitter at @doctorhandshake and @hardworkparty.
