Haptics Researchers at Penn Make Virtual Reality Tangible
by Evan Lerner
After being in the public’s science-fictional imaginings for decades, virtual reality finally seems poised to enter the mainstream. The world’s top tech companies, including Google, Microsoft, Samsung, and Facebook, are ramping up their investments in the technology, and long-awaited virtual reality headsets, such as the Oculus Rift, are now becoming available to consumers.
With so much interest in seeing virtual worlds and objects, haptics researchers at Penn are making sure people can touch them, too. Haptics is the engineering discipline of capturing and replicating tactile information.
Researchers recently gathered at the 2016 IEEE Haptics Symposium in downtown Philadelphia, where they presented research, demonstrated new hardware, and shared ideas with colleagues from around the world.
Penn Engineering’s Katherine J. Kuchenbecker, the Class of 1940 Bicentennial Endowed Term Chair Associate Professor in the departments of Mechanical Engineering and Applied Mechanics and Computer and Information Science, was the symposium’s co-chair.
The symposium was a reunion of sorts; in addition to Kuchenbecker’s current undergraduate and graduate students and post-doctoral researchers, many of the presenters were alumni and former visiting scholars in her lab.
“It’s really gratifying to see the seeds of our work being sown around the world,” Kuchenbecker said.
One research team from her lab — postdoc Jeremy Brown, grad student Mary Ibrahim, and undergrad Elyse Chase — presented an interactive demo of their haptic devices for robotic surgery.
Minimally invasive surgical techniques involve sliding tools and cameras into keyhole-sized incisions, which surgeons operate through virtual reality-style interfaces. While these high-tech techniques cut down recovery time, they can make routine, low-tech aspects of surgery, such as feeling for changes in stiffness that signal the boundaries of a tumor, much more difficult.
In the demo, a single fingertip-shaped sensor was connected to three types of feedback devices, each with a different number of touch-simulating actuators. Visitors could try pressing the sensor into model tumors and compare the sensations from each device.
“It might seem obvious that four actuators would be better than three,” said Brown, who also presented a recent paper on their research at the symposium, “but it could also be that it doesn’t make a difference and there’s no reason to make the device more complex.”
The researchers are currently looking at quantitative data on how the different devices translate the signals from the sensor to the actuators.
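The article doesn't specify how the devices map sensor signals onto their actuators, but one common approach in haptic rendering is to spread a sensed contact across nearby actuators by linear interpolation, so that devices with more actuators resolve the contact location more finely. The sketch below is a hypothetical illustration of that idea, not the team's actual method; the function name and the normalized units are assumptions.

```python
def actuator_commands(contact_pos, pressure, n_actuators):
    """Distribute a sensed contact across evenly spaced actuators.

    contact_pos: normalized position of the contact along the fingertip, in [0, 1].
    pressure:    normalized contact pressure from the sensor, in [0, 1].
    Returns a list of n_actuators drive intensities in [0, 1].
    """
    # Actuator centers evenly spaced along the fingertip.
    centers = [i / (n_actuators - 1) for i in range(n_actuators)]
    spacing = 1.0 / (n_actuators - 1)
    commands = []
    for c in centers:
        # Triangular weight: full strength at the actuator's center,
        # falling to zero one actuator-spacing away.
        w = max(0.0, 1.0 - abs(contact_pos - c) / spacing)
        commands.append(pressure * w)
    return commands
```

With three actuators, a contact at the midpoint drives only the middle actuator; with four, the same contact is split between the two inner actuators, which is the kind of difference the study's comparison between actuator counts would probe.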
“Ultimately, we want surgeons to try each of these devices in a human subject experiment,” said Ibrahim, “then get both objective measures of how well they completed a task, and which of the devices they felt accomplished the task the best.”
Evan Lerner is a science news officer in Penn’s Office of University Communications.