Apple bought an eye-tracking company — what’s next?

The following blog post was inspired by a conversation with fellow Super Ventures partners Ori Inbar and Mark Billinghurst as we discussed how Apple could use the eye-tracking tech it reportedly acquired from SMI.

Photo Source: SensoMotoric Instruments

Apple has acquired SensoMotoric Instruments, an eye-tracking company based in Germany, according to a recent report by MacRumors. SensoMotoric Instruments (or SMI) was founded in 1991 as a spin-off from academic and medical research at the Free University of Berlin. The company specialized in eye-tracking hardware and software solutions, especially for headworn devices, and recently rolled out eye-tracking support for Virtual Reality (VR) head-mounted displays, including the HTC Vive, Oculus Rift and Samsung Gear VR.

SMI’s acquisition is one of many AR+VR-related startup purchases by Apple over the past few years, including Metaio, PrimeSense, FaceShift and FlyBy Media. These companies have been acquired for their talent and IP, all with the aim of equipping Apple with the ingredients it needs to compete in this next wave of computing.

We have already seen how Apple utilized Metaio and FlyBy Media in its huge move into augmented reality on iPads and iPhones with ARKit. Rumors suggest that the PrimeSense IP will come into play with the upcoming iPhone 8, which is expected to feature depth-sensing technology.

So how could we see SMI’s technology be used?

Apple is readying AR Glasses

The most obvious answer is that Apple will use this technology in an upcoming pair of AR glasses. There have been many rumors about Apple’s foray into the headworn space and the acquisition of SMI certainly adds further fuel to the fire.

AR smartglasses require a combination of tracking, display and interaction technology to enable users to see and interact with AR content fixed in space. For example, the Microsoft HoloLens uses depth sensors for room-scale inside-out tracking, an optical see-through display, and camera input to support simple gesture interaction. With its previous purchases, Apple has access to world-class tracking technology, as shown by the release of the ARKit library and the videos appearing of a variety of AR objects fixed in space, even at building scale.

On the input side, the PrimeSense sensor was originally used in the Xbox and can be used for natural gesture and body tracking. Purchasing SMI will allow Apple to add eye-tracking input to smartglass interfaces. This is important because, unlike most other input modalities, eye-tracking supports implicit rather than intentional input. People subconsciously gaze at objects (implicit input) before consciously reaching out to interact with them (intentional input). Eye-tracking can also reveal a lot about the user’s focus of attention and cognitive load (through pupil dilation).

Overall, the SMI purchase provides the opportunity to create extremely natural user interfaces for smartglass systems. The shift of displays from the hand to the face requires new user interactions, and eye-tracking could be a critical modality in helping us interact with a headworn device more naturally than tapping in the air or swiping a finger on the arm of the frame. Google already knows this, which is why it purchased Eyefluence late last year, although Eyefluence’s technology is mainly used for interaction with 2D user interface elements, rather than the true 3D eye-tracking provided by SMI.
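To illustrate how gaze can act as an input modality, a common pattern is dwell-time selection: a UI element is triggered once the eyes rest on it for long enough. The sketch below is purely illustrative, with made-up sample data and thresholds; it does not reflect SMI’s actual SDK, which is not public.

```python
# Illustrative dwell-time gaze selection: a target is "selected" once
# the user's gaze rests on it continuously for a dwell threshold.
# Names, data and thresholds here are hypothetical, not SMI's API.

DWELL_TIME = 0.8  # seconds of continuous gaze required to select (assumed)

def dwell_select(samples, dwell_time=DWELL_TIME):
    """samples: iterable of (timestamp_sec, target_id or None) gaze hits.
    Returns the first target gazed at continuously for dwell_time."""
    current, start = None, None
    for t, target in samples:
        if target != current:          # gaze moved to a different target
            current, start = target, t
        elif target is not None and t - start >= dwell_time:
            return target              # dwell threshold reached
    return None

# Example: gaze flicks past "menu", then settles on "play" for 0.9 s
samples = [(0.0, "menu"), (0.2, "play"), (0.5, "play"), (1.1, "play")]
print(dwell_select(samples))  # -> "play"
```

Dwell selection is attractive on smartglasses precisely because it needs no hands at all, though real systems must tune the threshold carefully to avoid the “Midas touch” problem of selecting everything the user merely glances at.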

SMI has used its eye-tracking tech in smartglasses as a way to interact with headworn content. In 2016, the Lumus DK-50 AR glasses were demoed at AWE with SMI’s eye-tracking cameras and advanced software to create a solution that is highly reactive to the wearer’s gaze. SMI had previously partnered with Epson to demonstrate hands-free interaction at Siggraph 2015 and earlier that year showcased the benefits of eye-tracking on Google Glass.

SMI’s eye-tracking has also been used to select menus and interact with content on a Samsung Gear VR, illustrating a future when “eye-tracking in VR is standard”. SMI was also one of the few companies (along with Tobii and Pupil Labs) to offer eye-tracking as an add-on to existing VR head-mounted displays such as the HTC Vive. The addition of eye-tracking to VR enables a variety of interesting eye-gaze-based interaction techniques, such as those recently shown by the Empathic Computing Lab at the University of South Australia.

In addition to using eye-tracking as a method for input, SMI’s tech addresses another major problem in AR+VR: rendering performance. SMI has pioneered foveated rendering, a technique which uses eye-tracking to reduce the rendering workload of a headworn device by greatly reducing image quality in the peripheral vision (outside the region the fovea is fixated on). At the beginning of last year, SMI published a video showing its tech on the Oculus Rift DK2 delivering foveated rendering at 250Hz, and later on mobile VR using the Samsung Gear VR.
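The core idea of foveated rendering can be sketched in a few lines: shade at full resolution only near the gaze point and at progressively lower resolution further out. The eccentricity bands and scale factors below are made-up examples for illustration, not SMI’s actual parameters.

```python
import math

# Sketch of the idea behind foveated rendering: full detail near the
# gaze point, coarser shading in the periphery. Band boundaries and
# scale factors are illustrative assumptions.

def shading_scale(eccentricity_deg):
    """Fraction of full resolution to render at, given the angular
    distance (degrees) from the current gaze point."""
    if eccentricity_deg < 5:     # foveal region: full detail
        return 1.0
    elif eccentricity_deg < 20:  # parafoveal ring: half resolution
        return 0.5
    else:                        # far periphery: quarter resolution
        return 0.25

def eccentricity(gaze, pixel, deg_per_px=0.02):
    """Approximate angular distance between the gaze point and a pixel
    (assumes a roughly uniform degrees-per-pixel mapping)."""
    dx, dy = pixel[0] - gaze[0], pixel[1] - gaze[1]
    return math.hypot(dx, dy) * deg_per_px

gaze = (960, 540)                                      # gaze at screen centre
print(shading_scale(eccentricity(gaze, (970, 540))))   # near fovea -> 1.0
print(shading_scale(eccentricity(gaze, (1920, 1080)))) # corner -> 0.25
```

Because the fovea covers only a few degrees of the visual field, dropping peripheral resolution like this can cut pixel-shading work dramatically, which is why eye-tracking fast enough to keep up with saccades (hence SMI’s 250Hz figure) matters so much.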

SMI’s eye-tracking has also been used as a measurement tool, analyzing and providing insights on what a user is looking at. The ability to monitor and measure this user behavior could be extremely useful to Apple in the R&D phase of headworn content, which Apple is most likely working on ahead of the launch of its AR glasses. These same tools could also equip Apple developers with powerful AR+VR analytics, which would not only help make content better but also unlock new monetization methods such as eye-tracked advertising.

Eye-tracking is coming to future iPhones, iPads and MacBooks

One possible shape SMI’s tech may take in the Apple ecosystem is equipping next-generation MacBook, iPhone and iPad devices with eye-tracking for both interaction and data collection. While SMI is best known for its headworn eye-tracking tech, it also offered a lightweight portable eye-tracking bar that could be placed on a laptop or computer monitor to track a user’s eyes.

SMI’s tech in existing products could also reinforce rumors that Apple is considering facial scanning as a new form of authentication. It could also be used in conjunction with FaceShift to turn users into highly realistic digital avatars in applications such as FaceTime and iMessage, as SMI showed in a video entitled “The SMI Social Eye”. Face and eye-tracking are naturally complementary technologies that will enable very lifelike virtual copies of users to appear on desktop and mobile.

The Apple Car will watch you

But perhaps it is in Apple’s “mother of all AI projects”, the Apple Car, that SMI’s tech will show up first. In 2015, SMI and DEWEsoft introduced a turnkey Driver Machine Monitoring and Analysis platform. Eye-tracking can provide a semi-autonomous vehicle with the information it needs about the driver to keep them safe, for example by activating cruise control when the driver is tired. Eye-tracking can also be used as an input to control a digital heads-up display on the windshield, as SMI competitor Tobii has demonstrated.
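One well-established way driver-monitoring systems detect tiredness from eye-tracking data is the PERCLOS metric: the fraction of time the eyes are mostly closed over a sliding window. The sketch below is a simplified illustration of that general technique; the thresholds and sample data are assumptions, and SMI’s actual platform may work quite differently.

```python
# Simplified sketch of PERCLOS, a standard drowsiness metric from the
# driver-monitoring literature: the fraction of time the eyes are
# nearly closed over a window. Thresholds and data are illustrative.

PERCLOS_ALERT = 0.15   # flag drowsiness above 15% closed time (assumed)
CLOSED_BELOW = 0.2     # eye-openness below 20% counts as "closed" (assumed)

def perclos(openness_samples, closed_below=CLOSED_BELOW):
    """openness_samples: eye-openness values in [0, 1] over a time
    window (1.0 = fully open). Returns the fraction counted as closed."""
    closed = sum(1 for o in openness_samples if o < closed_below)
    return closed / len(openness_samples)

def driver_drowsy(openness_samples):
    return perclos(openness_samples) > PERCLOS_ALERT

alert  = [1.0, 0.9, 1.0, 0.1, 1.0, 0.95, 1.0, 1.0, 0.9, 1.0]  # one blink
drowsy = [0.1, 0.15, 0.9, 0.1, 0.05, 1.0, 0.1, 0.1, 0.9, 0.1]  # long closures
print(driver_drowsy(alert))   # -> False
print(driver_drowsy(drowsy))  # -> True
```

A real system would also smooth out normal blinks and combine PERCLOS with head pose and gaze direction before taking an action like engaging cruise control.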

Apple wants to give Siri eyes in your home

Apple’s most recent product announcement, HomePod, begins to put Siri in your home as a virtual assistant, competing with the likes of Google Home and Amazon Echo. While it is currently just a speaker, Amazon’s Echo Show, which adds a touch screen and camera, may allude to a future in which our home assistants aren’t just listening but are watching us in order to serve us better. If so, a HomePod with an integrated eye tracker could help a virtual assistant understand when you are looking at it, and perhaps even how you are looking at it, so it can respond in a much more “human” manner.
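A simple way to frame “the assistant knows you are looking at it” is as a geometry check: does the user’s gaze direction line up with the direction from the user to the device? The sketch below is a hypothetical illustration of that idea; the cone angle and coordinate setup are assumptions.

```python
import math

# Hypothetical "is the user looking at the device?" check: compare the
# gaze direction against the user-to-device direction and trigger
# attention when they agree within a small cone. All values assumed.

ATTENTION_CONE_DEG = 10  # assumed angular tolerance

def looking_at(gaze_dir, user_pos, device_pos, cone_deg=ATTENTION_CONE_DEG):
    """gaze_dir: 3D gaze vector; user_pos/device_pos: 3D points."""
    to_dev = [d - u for d, u in zip(device_pos, user_pos)]
    dot = sum(g * t for g, t in zip(gaze_dir, to_dev))
    norm = (math.sqrt(sum(g * g for g in gaze_dir)) *
            math.sqrt(sum(t * t for t in to_dev)))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    return angle <= cone_deg

print(looking_at((0, 0, 1), (0, 0, 0), (0, 0, 2)))  # facing device -> True
print(looking_at((1, 0, 0), (0, 0, 0), (0, 0, 2)))  # looking away -> False
```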

The iBrain is coming?

SMI also has strengths in the Brain Computer Interface (BCI) area, in which EEG sensing can be used to enable computers to respond to brain activity. The BCI space is heating up, and SMI may help Apple get into this race. Earlier this year both Mark Zuckerberg and Elon Musk announced plans to tap into the power of the brain as a new computer interface. SMI’s multi-modal software solution supports EEG as an input, and the team has demonstrated its platform with brain-sensing devices such as Emotiv. This could be valuable for Apple, which may be thinking ahead to when we move past the use of our hands, voice and eyes and control the digital world with our thoughts.

Whatever form SMI’s team and IP take at Apple, it is clear that the company remains committed to pushing the limits of computing. We will all be watching.

Thanks to Mark Billinghurst and Ori Inbar for the great chat and for contributing to and editing this post.