Supercon 2019: Kim Pimmel Offers a New Perspective on Immersive Tech
Kim’s project may be on rails, but don’t let that fool you into thinking you know where this is going.
The Hackaday Superconference returned for its fifth year in 2019, and the talks were better than ever. While my primary role involved prepping all of the slides for speakers, I did manage to catch a few talks as a member of the audience.
Among them was Kim Pimmel’s Beyond the Rectangle: Building Cameras for The Immersive Future. As someone who is unapologetically in love with all things VR and AR, this was an amazing experience. You can watch the entire talk online, and I recommend you do so, but before that, let’s talk about Kim’s unique take on capturing moments like never before.
A Combination of Creativity and Unique Approaches to Hardware
Kim Pimmel currently works as the AR design lead at Adobe Aero, but his prior experience includes working on the OS for Microsoft’s HoloLens AR headset.
His love for cameras and photography started back when he was 10 years old, shooting on film cameras that feel like relics compared to the technology we use today.
Back then, he learned to take pictures by carefully framing and preparing his shot. With limited funds at the time, he adopted what he calls a “one shot, one kill” mentality.
In particular, a camera from the 1950s really caught his eye. Known as the Stereo Realist, this was an early 3D camera that shot photos onto 35mm film by interleaving the stereo frames onto the film.
Rapidly flipping between the side-by-side photos achieves a 3D-like effect that you can see roughly four minutes into Kim’s talk. By combining this with macro lenses and some truly unique imagery, Kim was able to put together some stunning animations that found traction online.
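If you want to try the flipping trick Kim describes, it can be approximated in a few lines of Python with the Pillow imaging library: split a side-by-side stereo image down the middle and save the two halves as an alternating GIF. Everything below — the function name, the timing, and the side-by-side layout — is an assumption for illustration, not Kim’s actual workflow:

```python
# Sketch: turn a side-by-side stereo pair into a "wigglegram" GIF by
# rapidly alternating the left and right frames. Assumes the input image
# holds the two eye views side by side. Requires Pillow (pip install Pillow).
from PIL import Image


def make_wigglegram(stereo_path: str, out_path: str, period_ms: int = 120) -> None:
    """Split a side-by-side stereo image and save an alternating GIF."""
    pair = Image.open(stereo_path)
    w, h = pair.size
    left = pair.crop((0, 0, w // 2, h))     # left-eye frame
    right = pair.crop((w // 2, 0, w, h))    # right-eye frame
    # Alternate the two frames forever; the brain reads the parallax as depth.
    left.save(out_path, save_all=True, append_images=[right],
              duration=period_ms, loop=0)
```

The `duration` parameter controls how fast the frames flip; values around 100–150 ms give the bobbing, 3D-ish look you see in Kim’s animations.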
All of this led him to work with Microsoft on the augmented reality headset known as the HoloLens. These were the early days of AR and VR, but the work led Kim to start looking into emerging hardware that could combine the technology with his love for photography and filmmaking.
The First (Doomed) Prototype
The rise of 360-degree cameras caught Kim’s eye, but they often lacked context and framing. He thought about the iconic bullet time scene from The Matrix where Keanu Reeves dodges bullets in an effect achieved through the use of a massive camera array.
All of this eventually led Kim to dive headfirst into a project where he would create an on-rails camera system. The journey began, as I’m sure you can imagine, with buying rails.
Kim chose curved rails on purpose, as he wanted to leverage the concept of a focal point in his proof of concept. Despite having no hardware experience, and lacking the proper tools to model his carriage (he used Cinema 4D, which is typically used for VFX), he pressed onward.
The first prototype came together with a GoPro Hero 4, some Adafruit and Arduino components, a belt system, and a DC motor, creating a piece of hardware that could capture 120 frames per second while moving along the rail.
The main issue with this first draft of the concept was a lack of image stability. There was far too much shaking and no consistent horizontal level, which prevented a smooth 3D image in VR or AR.
A New Type of Photography For a New Type of Viewing Experience
It was time for a new camera. Kim chose the Fujifilm W3 3D for this purpose because it took photos in a native stereo format using two lenses on the device.
Getting a consumer device to trigger remotely was another obstacle in the way, but Kim was able to reverse engineer the camera, probe voltages and traces, and ultimately build a remote trigger hack for the device.
The new rail system swapped the DC motor for a stepper, driven by an EasyDriver board attached to an Arduino. Other quality-of-life improvements included a means of catching the carriage at either end of the rail, and a screen with a UI for making adjustments.
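An EasyDriver turns simple step/direction pulses into stepper coil currents, so the smoothness of the carriage comes down to the timing between pulses. Below is a minimal Python sketch (the function name and all the numbers are illustrative assumptions, not Kim’s firmware) of the kind of trapezoidal speed ramp that keeps a carriage from jerking at the start and end of a move:

```python
# Sketch: compute per-step delays for a smooth carriage traverse — the
# kind of timing an Arduino would feed an EasyDriver's STEP pin. All
# parameters here are illustrative assumptions.

def trapezoid_delays(total_steps: int, max_sps: float, accel_steps: int):
    """Return a list of per-step delays (seconds) for a trapezoidal move.

    total_steps: steps for the full traverse of the rail
    max_sps:     cruise speed, in steps per second
    accel_steps: steps spent ramping up (and the same ramping down)
    """
    delays = []
    for i in range(total_steps):
        if i < accel_steps:                      # ramp up linearly
            sps = max_sps * (i + 1) / accel_steps
        elif i >= total_steps - accel_steps:     # ramp down linearly
            sps = max_sps * (total_steps - i) / accel_steps
        else:                                    # cruise at full speed
            sps = max_sps
        delays.append(1.0 / sps)
    return delays
```

On the Arduino side, each computed delay would become a `delayMicroseconds()` between toggling the STEP pin, so the carriage eases in and out of motion instead of lurching.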
The final step was a wooden housing that would allow it to be portable and to sit on a tripod. Since the camera is able to take video, Kim showcased some shots of the stereo pairs that the camera captured.
Combining these into a viewing experience on an AR or VR headset lets the viewer see a 3D image with depth that’s difficult to showcase on a flat screen. Kim likens the effect to “looking through a window.”
Taking The Concept Further
Since the effect is difficult to appreciate outside of a VR or AR headset, Kim went on to propose new types of technology, such as a camera phone that could capture multiple points of light from different angles simultaneously via a strip of optics on the back of the phone, akin to lightfield technology.
One limitation that bothered Kim was that the viewer could move left and right, but not up and down. He started on his next project, which would allow a camera to capture images from more angles, so the viewer could see from any perspective they wished.
Conceptually, Kim also pointed out that this could turn moments like the zoom and enhance scene from Blade Runner into a reality. Kim goes on to discuss cutting-edge technology in optics, and how companies are working to solve the issues of interactivity in VR.
Haptic feedback, locomotion solutions like the Virtuix Omni, and other experimental technology are solving parts of the problem. Kim refers to modern, mainstream screens as “the rectangle,” and the goal is to make this rectangle more immersive.
We’ve come a long way, but there is still much more we could do to bring the images we capture to life. With people like Kim working on solutions like his on-rails camera tool, I’d say the future of immersive technology will come into focus faster than we think.