This article summarizes a paper by Parastoo Abtahi, Benoit Landry, Jackie Yang, Marco Pavone, Sean Follmer, and James Landay. The paper will be presented at CHI 2019 on Wednesday, May 8, 2019, at 2:40 PM in the session “Mapping and 3D”, room Boisdale 1.

Left: a user touching a spaceship in the virtual world. Right: to provide the sensation of touch, a swarm of drones searches the real environment for an everyday object with a similar shape and presents that object to the user.

Imagine a future …

where drones are smaller, quieter, safer, and more affordable. Now imagine that you own a few of these drones, each with a gripper mechanism attached so it can pick up different objects. You put on your Virtual Reality (VR) headset and enter a virtual world. The drones begin to scan the room in search of everyday objects or 3D-printed props that resemble the shape of the virtual objects in the scene. When a match is detected, the closest drone picks up that object. As you reach out to touch a spaceship in VR, and before you make contact with it, the system predicts what you’re about to touch. …
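To make that flow a bit more concrete, here is a minimal, hypothetical sketch of the pipeline the scenario describes: match virtual objects to nearby physical props by shape, send the nearest drone to fetch the prop, and roughly predict which virtual object the user is reaching for. All of the names below (VirtualObject, PhysicalProp, shape_similarity, predict_touch_target, and so on) are made up for illustration and are not from the paper; the actual system’s shape matching and reach prediction are far more involved than this toy version.

```python
# Hypothetical sketch of the scenario above; not the authors' implementation.
from dataclasses import dataclass
import math

@dataclass
class VirtualObject:
    name: str
    position: tuple          # (x, y, z) in a shared tracking frame
    shape_descriptor: list   # coarse shape features (placeholder)

@dataclass
class PhysicalProp:
    name: str
    position: tuple
    shape_descriptor: list

@dataclass
class Drone:
    drone_id: int
    position: tuple

def shape_similarity(d1, d2):
    # Placeholder similarity: negative distance between shape descriptors.
    return -math.dist(d1, d2)

def find_prop_for(virtual_obj, props, threshold=-0.5):
    """Return the prop whose shape best resembles the virtual object,
    or None if nothing is similar enough."""
    best = max(
        props,
        key=lambda p: shape_similarity(p.shape_descriptor, virtual_obj.shape_descriptor),
        default=None,
    )
    if best and shape_similarity(best.shape_descriptor,
                                 virtual_obj.shape_descriptor) >= threshold:
        return best
    return None

def assign_nearest_drone(prop, drones):
    """Pick the drone closest to the matched prop to go pick it up."""
    return min(drones, key=lambda d: math.dist(d.position, prop.position))

def predict_touch_target(hand_pos, hand_dir, virtual_objects):
    """Very rough reach prediction: the virtual object closest to a ray
    cast from the user's hand along its (unit) motion direction."""
    def ray_point_distance(obj):
        v = [o - h for o, h in zip(obj.position, hand_pos)]
        t = max(0.0, sum(vi * di for vi, di in zip(v, hand_dir)))
        closest = [h + t * d for h, d in zip(hand_pos, hand_dir)]
        return math.dist(closest, obj.position)
    return min(virtual_objects, key=ray_point_distance)
```

In the sketch, predicting the touch target before contact is what would give a drone time to fly the matched prop into place under the user’s hand.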

Parastoo Abtahi

CS/HCI PhD Student at Stanford University
