The eyes have it

Last Thursday, 21 July 2017, I had another one of those milestone moments that will stay with me forever: a photorealistic avatar turned towards me and made eye contact in VR.

In a nondescript room at the University of Sydney, I was sitting on a low chair, wearing a Vive headset and having a conversation with possibly the most realistic human CG head ever created. As part of his digital human PhD, Mike Seymour had his head and eyes scanned and, in collaboration with a star-studded line-up of tech companies led by Epic Games, oversaw the painstaking creation of “digital Mike”, an elaborately rigged and textured CG head that represents the latest developments in digital human research.

You have probably seen high-resolution CG heads before, but this one is different. This is not pre-rendered video: Mike’s head is being rendered live, a realtime VR experience produced in a game engine. Across the room, Mike is puppeteering digital Mike from behind a screen. His every head movement and facial expression is streamed across the network as realtime animation and audio data to PCs rendering his on-screen double in a custom build of the Unreal Engine. Wearing my HMD, I’m not aware of any of this. After a few minutes it just feels like we’re having a conversation, face-to-face but online, without video, and it feels real enough to me. It’s a taste of what high-end digital productions can do in 2017, and of what mainstream consumer experiences might look like in the near future.

I’m helping Mike stage this as a public experience at the Los Angeles Convention Center for #SIGGRAPH2017, but today there are just the four of us in the lab: Mike in the interviewer’s chair (wearing a Technoprops facial mocap helmet rig, made famous by the making-of clips from Avatar), myself in the interviewee’s chair, Stephen monitoring the facial mocap solver, and Hamish monitoring the TV spectator view of the show. My avatar, the one Mike sees on his screen, is a less complex but still impressive creation: Loom.AI provides a web service for generating an avatar from a single photo. At SIGGRAPH this lower-resolution avatar will appear in front of the real Mike on screen, with realtime mouth and eye tracking. Having seen the possibilities, I’m quite tempted to take a drill to Mod’s Vive headset and install some cameras. The eyes really have it.

This isn’t the first milestone moment I’ve had at Sydney Uni. Just across City Road from here is where I cobbled together one of Australia’s first websites. It was really hard to describe how I felt back then, so I’m going to have a better crack at describing this one. This was a moment that sent a chill down my spine. I have absolutely no doubt that anyone born this week will find it hard to get very excited about this project, but the global effort to get to this point, across so many different areas of art and science, is truly staggering. Last Thursday, I reckon, another little piece of digital cultural history fell into place, and I was chuffed to witness it. For now I’ll park this post and wait for the big reveal at SIGGRAPH. I can’t wait to watch the crowd this week take it all in as they #MeetMike.

