Humans and robots go hand in hand

Purdue College of Engineering
Purdue Engineering Review
4 min read · Jan 20, 2023

“No man is an island, entire of itself,” the English poet John Donne wrote. The same could be said of robots: they are not ultimately stand-alone solutions. Just as humans need the mechanical and computational assistance that robots provide, robots need the flexibility and safety awareness of humans, as the two increasingly share a workspace in close physical proximity.

That calls for research at the intersection of mechanical design, sensory perception, and dynamic control, as well as more investigation into human-safe collaborative robots (cobots), soft robotics, and robotic manipulation. The goal is to develop safe, high-performing robots that can work in dynamic environments beside us to perform challenging tasks.

Conventional industrial robots, such as those widely deployed in factories to build cars, are fast, strong and precise. However, they cannot cope with uncertainty and dynamic events in environments shared with humans. Under some circumstances, they can even cause severe injuries and damage, due to their rigid, heavy construction.

In contrast, soft robots, because they deform upon impact, offer inherent safety to the surrounding environment, yet may suffer from limited performance due to their hyper-elastic properties.

My vision is to integrate both rigidity and flexibility to develop safer, higher-performance machines. As an example, I have developed variable-stiffness robotic arms and hybrid rigid-soft robots that combine the advantages of what are typically either purely rigid or purely soft robots.

But even an advanced mechanical design is not enough to perform complex manipulation tasks. A robust perception system is also required to identify the state of the machine and its environment. Tactile sensing is especially critical because it is how robots physically understand the world. Vision-based tactile sensing, which draws on the rich information captured by embedded cameras, could be one solution for estimating complex touch information for robots.
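
To make the idea concrete, here is a minimal sketch of how images from an embedded camera might be turned into touch information. It is an illustrative example only, not the actual pipeline used in my lab: it assumes a GelSight-style sensor whose gel surface deforms under contact, and it uses off-the-shelf optical flow from OpenCV to estimate where and how much the surface has moved.

```python
# Illustrative sketch only (not the MARS lab's actual pipeline): estimate touch
# from two images of a deformable gel surface captured by an embedded camera.
import cv2
import numpy as np

def estimate_contact(reference_gray, contact_gray):
    """Return a per-pixel displacement field and a crude contact mask."""
    # Dense optical flow between the no-contact and contact frames tells us
    # how far each point on the gel surface has shifted.
    flow = cv2.calcOpticalFlowFarneback(
        reference_gray, contact_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    magnitude = np.linalg.norm(flow, axis=2)   # displacement per pixel
    # Pixels that moved much more than average are treated as the contact patch.
    contact_mask = magnitude > magnitude.mean() + 2 * magnitude.std()
    return flow, contact_mask

# Usage: frames come from the camera embedded under the gel pad
# (file names here are hypothetical placeholders).
# ref = cv2.imread("no_contact.png", cv2.IMREAD_GRAYSCALE)
# cur = cv2.imread("pressing.png", cv2.IMREAD_GRAYSCALE)
# flow, mask = estimate_contact(ref, cur)
```

Real vision-based tactile sensors refine this basic idea with calibrated markers, photometric stereo and learned models, but even a simple displacement field like this captures where contact occurs and how an object slides.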

Two dexterous robots equipped with 2-finger parallel grippers and vision-based tactile sensors for tactile-enhanced manipulation tasks in Purdue’s MARS lab. (Purdue University photo/MARS lab)
In Purdue’s MARS lab, a pair of dexterous robots on the left performs a hand-over task, while a pair of collaborative robots on the right performs dynamic manipulation tasks. (Purdue University photo/MARS lab)

In my research, I integrated embedded vision sensors into hybrid rigid-soft robots to achieve high-resolution, high-dimensional proprioception (the ability to sense movement, action and location) and exteroception (the ability to sense the surrounding external environment) simultaneously, in order to better perform these complex manipulation tasks.

Finally, control strategies are essential if intelligent machines are to act successfully and autonomously in such environments, and that calls for powerful computational and algorithmic elements. My strategy, leveraging the advanced hardware platform and embedded vision sensing, is to apply both model-based and data-driven methods to complex robotic manipulation tasks such as deformable object manipulation, thereby overcoming a huge hurdle for traditional robotic systems.
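
As a toy illustration of that combination (my own simplified example, not the controller we actually deploy), imagine pushing a mass along a line: a physics model supplies the feedforward force, and a residual term fit from logged data corrects for drag the model does not know about, the same kind of mismatch that deformable objects create at a much larger scale.

```python
# Self-contained toy example of "model-based plus data-driven" control.
# All numbers and names here are illustrative assumptions, not lab results.
import numpy as np

m_hat = 1.0        # mass assumed by the model-based part
true_drag = 0.8    # unmodeled drag the data-driven part must learn

# Model-based feedforward: F = m * desired acceleration.
def feedforward(accel_desired):
    return m_hat * accel_desired

# "Experience": at velocity v, the force the model misses is roughly drag * v.
velocities = np.linspace(0.0, 2.0, 50)
force_errors = true_drag * velocities + 0.01 * np.random.randn(50)

# Data-driven residual: least-squares fit of the missing force vs. velocity.
residual_coeffs = np.polyfit(velocities, force_errors, deg=1)

def residual(velocity):
    return np.polyval(residual_coeffs, velocity)

# Hybrid command: model-based term plus learned correction.
def command(accel_desired, velocity):
    return feedforward(accel_desired) + residual(velocity)

print(command(accel_desired=1.5, velocity=1.0))  # roughly 1.5 * m_hat + 0.8
```

The model-based part keeps the controller predictable and sample-efficient, while the data-driven part absorbs the effects that are too complex to model by hand.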

Much of this work is being pursued in the Mechanisms And Robotic Systems (MARS) lab at Purdue, as well as through affiliation with Purdue’s Institute for Control, Optimization and Networks (ICON). The institute’s mission is to integrate diverse, multidisciplinary expertise to address core challenges, like these in robotics, in complex, connected and autonomous systems. To pursue their innovations, some 75 ICON-affiliated faculty collaborate across more than 12 Purdue departments, as well as with industry, government agencies, and leaders and experts in the field.

An institute like ICON, with wide-ranging faculty knowledge, is critical for modern research. For instance, developing a robotic system requires researchers to draw on computer vision for perception; control and planning for manipulation and locomotion; mechanical design for hardware; and machine learning and artificial intelligence (AI) for analytics and decision making.

ICON organizes in-person weekly seminars that allow me to meet and talk to faculty with these and other backgrounds and discuss potential collaborations. In addition to linking faculty within the Purdue campus, ICON actively connects our faculty with industrial and federal partners. For example, it recently hosted a meeting between Saab leaders and Purdue faculty, and held a summit attended by many Department of Defense program managers.

No researcher is an island either. These kinds of interdisciplinary collaborations — in this case, at the crossroads of mechanical design, sensory perception, and dynamic control in robotics — promise to have a considerable impact in a wide range of sectors, including industrial manufacturing, household tasks, health/elderly care, home/hospital nursing, and grocery shopping, as well as on overall quality of life in the home and the workplace.

Yu She, PhD

Assistant Professor of Industrial Engineering

Director, Mechanisms And Robotic Systems (MARS) lab

Faculty Contributor, Institute for Control, Optimization and Networks (ICON)

Faculty Contributor, Autonomous and Connected Systems (ACS) Initiative

College of Engineering

Purdue University
