Intel
Technology’s Next Dimension
6 min read · Dec 17, 2014


The future of human-computer interaction

Our reality is shifting. The distinction between the real and the virtual is blurring. The Web is no longer a place we visit through our laptop or desktop computers; it is woven into the fabric of our lives.

For computers to truly become extensions of ourselves though, we need a new way to bridge the gap between us and them. In short, we need interfaces that make sense to humans. We don’t lack for ideas, as evidenced by Hollywood’s imagination of future computing. Captain Kirk talks directly to the Enterprise’s omnipresent computer. Apple’s Siri was a step toward that vision, albeit without the flirtation (for now). In Minority Report, Tom Cruise operates his computer through a gestural dance, a scene that inspired numerous Microsoft Kinect demos.

[Image: Minority Report, © 2002 DreamWorks Pictures / 20th Century Fox]

“The devices we’ve been using for the last 30 years have constrained our input to typing and mousing,” says Paul Tapp, senior product manager in Intel’s perceptual computing division dedicated to bringing human-like sensing to personal computers. “Keyboards and mice perform their function very well but they’re not very natural.”

Tapp is standing in an Intel research laboratory where several desktop and laptop computers are outfitted with the fruits of Intel’s multi-year commitment to perceptual computing, a collection of hardware and software called Intel® RealSense™ technology. The first core product is the Intel RealSense 3D camera, a pair of eyes for your machine, and some very smart computer vision software that makes it all work.

With Intel RealSense technology integrated into our desktop, laptop and mobile devices, our computers can recognize who we are, deduce how we are feeling, and interact with us in natural ways. We can draw in mid-air and see it appear on our screen, collaborate more intimately with colleagues half a world away, and digitize physical objects and bring them online in an instant.

Intel is demonstrating those use cases today. The RealSense 3D camera will ship on Ultrabook devices, notebooks, 2-in-1s, and all-in-one PCs from a variety of brands next year, and the software developer kit is now available.
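For developers, the entry point is refreshingly small. The sketch below shows the general shape of reading depth from a RealSense camera; it uses Intel’s later open-source librealsense Python bindings (pyrealsense2) rather than the 2014 SDK described above, so treat the specific calls as illustrative of the idea, not a tour of that kit.

```python
# A minimal sketch of reading depth from a RealSense camera using
# Intel's later open-source librealsense Python bindings (pyrealsense2),
# not the original 2014 SDK. Requires an attached RealSense camera.
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# Ask for a 640x480 depth stream at 30 frames per second.
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()   # block until a frameset arrives
    depth = frames.get_depth_frame()
    # Distance, in meters, to whatever the center pixel is looking at.
    meters = depth.get_distance(320, 240)
    print(f"Center of the scene is {meters:.2f} m away")
finally:
    pipeline.stop()
```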

But these are early days. As the Internet of Things comes alive — from connected cars to plants that tweet when they’re thirsty — perceptual interfaces will be essential for us to engage with these pervasive intelligent systems, and for the devices to understand our needs.

“Not only can our interaction with computers step away from the traditional two-dimensional keyboard and mouse, but robots, drones, and many other devices could use Intel’s RealSense technology to see and think,” says Intel CEO Brian Krzanich in a special issue of Intel’s iQ publication, titled “Science Of Seeing.”

While Intel RealSense technology is undoubtedly a giant leap in human-computer interaction, it is really an evolutionary step for the engineers inside Intel’s perceptual computing laboratory. For a decade, Intel’s researchers have been at the forefront of proactive computing, where computers anticipate human needs and, if necessary, act on our behalf.

This paradigm extends the notion of computers far beyond the desktop or laptop and into sensors, embedded processors, and wireless networks. Intelligence is embedded in every nook and cranny of our physical world, from our homes to our cars to our bodies.

It is what Intel futurist Brian David Johnson calls the “ghost of computing,” where processors become so small that they essentially disappear while simultaneously empowering us with their capabilities to enhance our lives. A grand vision to be sure, and closer than one might think. But it’s not without its challenges, specifically around how we tap into those computing resources in ways that feel human, that put us at the center of the experience.

“That’s the goal of RealSense, to take our devices from being trusty workhorses waiting for us to engage with them and make them understand us and work with us,” Tapp says.

It started with giving our machines ears. Several years ago, Intel worked with personal computer manufacturers to integrate multiple microphones, speech recognition, and natural language processing into laptop and desktop computers. The software comes from Nuance, the company whose voice recognition also powers Apple’s famed Siri. The combination of the software, multiple microphones, and Intel’s processors enables the computer to “focus” on the essential sounds in a room, filtering out the rest. Just like humans.
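The article doesn’t spell out how that filtering works, but the classic technique behind “focusing” a microphone array is delay-and-sum beamforming, sketched below in deliberately simplified form. The geometry and steering math here are a generic illustration, not Nuance’s or Intel’s implementation.

```python
# A toy illustration of delay-and-sum beamforming, the classic way to
# "focus" a microphone array on one direction. A simplified sketch,
# not Nuance's or Intel's actual implementation.
import numpy as np

SPEED_OF_SOUND = 343.0  # meters per second, in air

def delay_and_sum(signals, mic_positions, direction, sample_rate):
    """Steer a mic array toward a sound source.

    signals:       (n_mics, n_samples) array of synchronized recordings
    mic_positions: (n_mics, 3) microphone coordinates in meters
    direction:     unit 3-vector of wave travel (from source toward array)
    """
    # Relative arrival time of the plane wave at each microphone.
    delays = mic_positions @ direction / SPEED_OF_SOUND
    delays -= delays.min()                        # make all delays non-negative
    shifts = np.round(delays * sample_rate).astype(int)

    # Slice each channel so the target direction lines up, then average.
    # Sounds from that direction add coherently; everything else blurs out.
    n = signals.shape[1] - shifts.max()
    aligned = np.stack([s[k:k + n] for s, k in zip(signals, shifts)])
    return aligned.mean(axis=0)
```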

“It’s called listening, and humans do it naturally,” Tapp says. “In most environments, there are sounds and other conversations all around us and we just ignore them.”

Now, when we want something from our computer — to search for a flight or play a particular song — we just need to ask.

The next challenge was exponentially harder. Could a computer learn to see? Webcams are nothing new. But they lack a key characteristic of human vision that most of us take for granted: We have two eyes that enable us to see in three dimensions. This year, Intel unveiled the RealSense 3D camera, a device the size of a stick of chewing gum that integrates two cameras, infrared lasers, and custom processing chips.
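The principle is simple to state: the same point in the scene lands at slightly different positions in the two cameras’ images, and that shift — the disparity — is inversely proportional to distance. The sketch below illustrates the idea with OpenCV’s generic block matcher and made-up calibration numbers; the RealSense camera adds infrared projection and dedicated silicon that this toy example doesn’t model.

```python
# Two-eyed depth in a nutshell: depth = focal_length * baseline / disparity.
# A generic OpenCV block-matching sketch with invented calibration values;
# not a model of the RealSense camera's actual pipeline.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # hypothetical inputs
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
# StereoBM returns fixed-point disparities with 4 fractional bits.
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

FOCAL_PX = 600.0    # focal length in pixels (assumed calibration)
BASELINE_M = 0.05   # distance between the two cameras, in meters (assumed)

valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = FOCAL_PX * BASELINE_M / disparity[valid]  # meters, per pixel
```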

“You can start doing things like see where someone is moving, the position of your hand, the emotion on your face, you can interpret body language,” Tapp says. “We humanized the vision capability of our devices.”

The story behind the RealSense 3D camera’s exquisite engineering is told in the article “From Lab to Reality” in Intel’s iQ publication. The applications of the technology must be seen to be believed, from games where characters are controlled by moving your fingers, to Intel RealSense Snapshot, which makes it possible to adjust your photos’ focal point or measure objects in the frame after the fact. It can even be the basis of a dynamic new musical interface where you literally mold the sounds with your fingertips.
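How can a photo be measured after the fact? Once every pixel carries a depth value, two clicked points can be deprojected into 3D space and the distance between them computed directly. Here is a back-of-the-envelope sketch using the standard pinhole camera model; the intrinsics and pixel coordinates are invented for illustration, and this is not RealSense Snapshot’s actual pipeline.

```python
# Measuring an object from a depth-enabled photo: deproject two pixels
# to 3D points with the pinhole model, then take their distance.
# Intrinsics (FX, FY, CX, CY) would come from calibration; these are
# made-up values for illustration.
import numpy as np

FX, FY = 600.0, 600.0   # focal lengths in pixels (hypothetical)
CX, CY = 320.0, 240.0   # principal point (hypothetical)

def deproject(u, v, depth_m):
    """Pixel (u, v) plus depth in meters -> 3D point in camera space."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m])

# Two clicked pixels and their depths, e.g. the ends of a table edge.
p1 = deproject(200, 240, 1.10)
p2 = deproject(450, 240, 1.12)
print(f"Measured length: {np.linalg.norm(p1 - p2):.2f} m")
```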

Perhaps most striking though is the feeling of intimacy that Intel RealSense technology brings to the computer and the person operating it.

In one demonstration, you stare into the screen and the computer instantly recognizes if you’re smiling, frowning, or surprised. If you’re confused or upset, your operating system could offer help before you even think to ask.
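A rough flavor of that kind of expression sensing can be had even without depth. The toy sketch below uses OpenCV’s bundled face and smile cascades; Intel’s face analysis is far richer and depth-assisted, so treat this as a stand-in for the idea, not the real thing.

```python
# A toy stand-in for expression sensing using OpenCV's bundled Haar
# cascades (face + smile). Not Intel's depth-assisted face analysis.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

frame = cv2.imread("webcam_frame.png")   # hypothetical input image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
    face = gray[y:y + h, x:x + w]
    # Strict parameters cut down on false smile detections.
    smiles = smile_cascade.detectMultiScale(face, 1.7, 20)
    print("smiling" if len(smiles) else "not smiling")
```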

In a next-generation video chat app using Intel RealSense technology, your background dissolves and your image is placed on the screen within the Web page you’re remotely sharing with someone else. Point out the best route on a map by running your finger along it, or watch funny cat videos with a faraway friend in a much more immersive way.
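The trick that makes the background dissolve is easy to see once depth is in the picture: keep the pixels near the camera and discard the rest. A simplified sketch, assuming a color frame with an aligned per-pixel depth map:

```python
# Depth-based background removal in its simplest form: keep only pixels
# closer than a cutoff. A simplified sketch, not Intel's segmentation.
import numpy as np

def remove_background(color, depth_m, cutoff_m=1.0):
    """color: (H, W, 3) uint8; depth_m: (H, W) meters, aligned to color."""
    person = (depth_m > 0) & (depth_m < cutoff_m)  # 0 means no depth reading
    out = np.zeros_like(color)
    out[person] = color[person]                    # background stays black
    return out
```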

“We can experience the content together,” Tapp says. “Games become much more social and interactive. We can laugh together.”

And someday, the computer may laugh right along with you.

What if your computer could see like you? A special issue of iQ, “Science of Seeing.” See the stories.
