RT/ Trotting robots reveal emergence of animal gait transitions

Published May 14, 2024 · 24 min read


Robotics & AI biweekly vol.94, 29th April — 14th May


  • A quadruped robot, trained using machine learning, adeptly avoids falls by seamlessly transitioning between walking, trotting, and pronking — a milestone for both roboticists and biologists studying animal locomotion.
  • Researchers have utilized robotic assistance effectively in crafting wind turbine blades, thereby eliminating harsh working conditions for humans and potentially enhancing product consistency.
  • An elastic electronic skin may furnish robots and devices with human-like softness and tactile sensitivity, enabling precise tasks requiring control over force.
  • A robot employing machine learning now fully automates a complex microinjection process crucial in genetic research.
  • An innovative algorithm encourages robots to gather more diverse data by moving in a more random fashion, leading to improved safety and practicality in self-driving cars, delivery drones, and beyond.
  • Soft skin coverings and touch sensors offer promising prospects for safer and more intuitive human-robot interaction, though their production remains costly and challenging. However, a recent study demonstrates efficient manufacturing of soft skin pads doubling as sensors, utilizing thermoplastic urethane and 3D printing.
  • Researchers have created miniature, flexible devices capable of encircling individual nerve fibers without causing damage, potentially aiding in the diagnosis and treatment of disorders like epilepsy and chronic pain, as well as enhancing prosthetic limb control.
  • Utilizing pliable soft materials for collaboration with humans and disaster relief has garnered significant attention. Yet, practical application remains challenging. Researchers have devised a method to control pneumatic artificial muscles, exploiting their rich dynamics as a computational resource.
  • A robotic simulation training program has been developed to offer trainee physicians additional practice in placing central lines, significantly reducing complications such as mechanical issues, infections, and blood clots.
  • Scientists have engineered a novel robotic suction cup capable of securely grasping rough, curved, and heavy stones.
  • And more!

Robotics market

The global market for robots is expected to grow at a compound annual growth rate (CAGR) of around 26 percent to reach just under 210 billion U.S. dollars by 2025.
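As a quick sanity check of the compounding arithmetic: the ~26 percent CAGR and the ~210 billion U.S. dollar 2025 figure come from the forecast above, while the implied 2018 starting value below is derived from them and is not a quoted statistic.

```python
# Sanity check of the market forecast's compounding arithmetic.
# The 26% CAGR and ~210 billion USD 2025 figure are from the text;
# the implied 2018 base value is derived, not a quoted statistic.
def compound(value: float, rate: float, years: int) -> float:
    """Grow `value` at `rate` per year for `years` years."""
    return value * (1 + rate) ** years

target_2025 = 210.0    # billion USD, from the article
cagr = 0.26
years = 2025 - 2018    # 7 years of growth

implied_2018 = target_2025 / (1 + cagr) ** years
print(f"Implied 2018 market size: ~{implied_2018:.0f} billion USD")
print(f"Check: {compound(implied_2018, cagr, years):.0f} billion USD by 2025")
```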

Size of the global market for industrial and non-industrial robots between 2018 and 2025 (in billion U.S. dollars). Source: Statista

Latest News & Research

Viability leads to the emergence of gait transitions in learning agile quadrupedal locomotion on challenging terrains

by Milad Shafiee, Guillaume Bellegarda, Auke Ijspeert in Nature Communications

With the help of a form of machine learning called deep reinforcement learning (DRL), a quadruped robot developed at EPFL notably learned to transition from trotting to pronking — a leaping, arch-backed gait used by animals like springbok and gazelles — to navigate challenging terrain with gaps ranging from 14 to 30 cm. The study, led by the BioRobotics Laboratory in EPFL’s School of Engineering, offers new insights into why and how such gait transitions occur in animals.

“Previous research has introduced energy efficiency and musculoskeletal injury avoidance as the two main explanations for gait transitions. More recently, biologists have argued that stability on flat terrain could be more important. But animal and robotic experiments have shown that these hypotheses are not always valid, especially on uneven ground,” says PhD student Milad Shafiee, first author of the paper.

Proposed learning architecture.

Shafiee and co-authors Guillaume Bellegarda and BioRobotics Lab head Auke Ijspeert were therefore interested in a new hypothesis for why gait transitions occur: viability, or fall avoidance. To test this hypothesis, they used DRL to train a quadruped robot to cross various terrains. On flat terrain, they found that different gaits showed different levels of robustness against random pushes, and that the robot switched from a walk to a trot to maintain viability, just as quadruped animals do when they accelerate. And when confronted with successive gaps in the experimental surface, the robot spontaneously switched from trotting to pronking to avoid falls. Moreover, viability was the only factor that was improved by such gait transitions.

“We showed that on flat terrain and challenging discrete terrain, viability leads to the emergence of gait transitions, but that energy efficiency is not necessarily improved,” Shafiee explains. “It seems that energy efficiency, which was previously thought to be a driver of such transitions, may be more of a consequence. When an animal is navigating challenging terrain, it’s likely that its first priority is not falling, followed by energy efficiency.”

Training scheme.

To model locomotion control in their robot, the researchers considered the three interacting elements that drive animal movement: the brain, the spinal cord, and sensory feedback from the body. They used DRL to train a neural network to imitate the spinal cord’s transmission of brain signals to the body as the robot crossed an experimental terrain. Then, the team assigned different weights to three possible learning goals: energy efficiency, force reduction, and viability. A series of computer simulations revealed that of these three goals, viability was the only one that prompted the robot to automatically — without instruction from the scientists — change its gait.
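The weighting of competing learning goals described above can be sketched as a scalarized reward. The term definitions and weights below are illustrative assumptions, not the paper's actual reward formulation:

```python
import numpy as np

# Illustrative sketch of a scalarized multi-objective reward of the kind
# described above. The term definitions and weights are assumptions for
# illustration, not the paper's actual reward function.
def locomotion_reward(energy_used, foot_forces, fell,
                      w_energy=0.1, w_force=0.1, w_viability=1.0):
    r_energy = -energy_used                     # penalize energy consumption
    r_force = -np.sum(np.square(foot_forces))   # penalize large contact forces
    r_viability = -1.0 if fell else 0.0         # penalize falling (viability)
    return w_energy * r_energy + w_force * r_force + w_viability * r_viability

# Example step: modest energy use, small contact forces, no fall.
r = locomotion_reward(energy_used=0.5, foot_forces=np.array([0.1, 0.2]), fell=False)
```

During DRL training, sweeping the weights reveals which objective actually drives a behavior: in the study, only the viability term prompted spontaneous gait changes.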

The team emphasizes that these observations represent the first learning-based locomotion framework in which gait transitions emerge spontaneously during the learning process, as well as the most dynamic crossing of such large consecutive gaps for a quadrupedal robot.

“Our bio-inspired learning architecture demonstrated state-of-the-art quadruped robot agility on the challenging terrain,” Shafiee says.

The researchers aim to expand on their work with additional experiments that place different types of robots in a wider variety of challenging environments. In addition to further elucidating animal locomotion, they hope that ultimately, their work will enable the more widespread use of robots for biological research, reducing reliance on animal models and the associated ethics concerns.

Toolpath generation for automated wind turbine blade finishing operations

by Hunter Huth, Casey Nichols, Scott Lambert, Petr Sindler, Derek Berry, David Barnes, Ryan Beach, Sahand Sabet, David Snowberg in Wind Energy

Researchers at the U.S. Department of Energy’s National Renewable Energy Laboratory (NREL) have successfully leveraged robotic assistance in the manufacture of wind turbine blades, allowing for the elimination of difficult working conditions for humans and the potential to improve the consistency of the product.

Although robots have been used by the wind energy industry to paint and polish blades, automation has not been widely adopted. Research at the laboratory demonstrates the ability of a robot to trim, grind, and sand blades. Those necessary steps occur after the two sides of the blade are made using a mold and then bonded together.

“I would consider it a success,” said Hunter Huth, a robotics engineer at NREL and lead author of a newly published paper detailing the work. “Not everything operated as well as we wanted it to, but we learned all the lessons we think we need to make it meet or exceed our expectations.”

The coauthors, all from NREL, are Casey Nichols, Scott Lambert, Petr Sindler, Derek Berry, David Barnes, Ryan Beach, and David Snowberg.

Robot cell with a band saw attachment and a 5-m test blade section (top left), Pushcorp grinding tool with custom dust collection shroud (top right), Tyrolit sanding tool (bottom left), and the Zivid II structured light camera (bottom right).

The post-molding operations to manufacture wind turbine blades require workers to perch on scaffolding and wear protective suits including respiratory gear. Automation, the researchers noted, will boost employee safety and well-being and help manufacturers retain skilled labor.

“This work is critical to enable significant U.S.-based blade manufacturing for the domestic wind turbine market,” said Daniel Laird, director of the National Wind Technology Center at NREL. “Though it may not be obvious, automating some of the labor in blade manufacture can lead to more U.S. jobs because it improves the economics of domestic blades versus imported blades.”

“The motive of this research was to develop automation methods that could be used to make domestically manufactured blades cost competitive globally,” Huth said. “Currently offshore blades are not produced in the U.S. due to high labor rates. The finishing process is very labor intensive and has a high job-turnover rate due to the harsh nature of the work. By automating the finishing process, domestic offshore blade manufacturing can become more economically viable.”

The research was conducted at the Composites Manufacturing Education and Technology (CoMET) facility at NREL’s Flatirons Campus. The robot worked on a 5-meter-long blade segment. Wind turbine blades are considerably longer, but because they bend and deflect under their own weight, a robot would have to be programmed to work on the bigger blades section by section.

The researchers used a series of scans to create a 3D representation of the position of the blade and to identify precisely the front and rear sections of the airfoil — a special shape of the blade that helps the air flow smoothly over the blade. From there, the team programmed the robot to perform a series of tasks, after which it was judged on accuracy and speed. The researchers found areas for improvement, particularly when it came to grinding. The robot ground down too much in some parts of the blade and not enough in others.

“As we’ve gone through this research, we’ve been moving the goal posts for what this system needs to do to be effective,” Huth said.

Stretchable hybrid response pressure sensors

by Kyoung-Ho Ha, Zhengjie Li, Sangjun Kim, Heeyong Huh, Zheliang Wang, Hongyang Shi, Charles Block, Sarnab Bhattacharya, Nanshu Lu in Matter

A new elastic electronic skin could give robots and other devices the same softness and touch sensitivity as human skin, opening up new possibilities to perform tasks that require a great deal of precision and control of force.

The new stretchable e-skin, developed by researchers at The University of Texas at Austin, solves a major bottleneck in the emerging technology. Existing e-skin technology loses sensing accuracy as the material stretches, but that is not the case with this new version.

“Much like human skin has to stretch and bend to accommodate our movements, so too does e-skin,” said Nanshu Lu, a professor in the Cockrell School of Engineering’s Department of Aerospace Engineering and Engineering Mechanics who led the project. “No matter how much our e-skin stretches, the pressure response doesn’t change, and that is a significant achievement.”

Lu envisions the stretchable e-skin as a critical component to a robot hand capable of the same level of softness and sensitivity in touch as a human hand. This could be applied to medical care, where robots could check a patient’s pulse, wipe the body or massage a body part.

Why is a robot nurse or physical therapist necessary? Around the world, millions of people are aging and in need of care, more than the global medical system can provide.

“In the future, if we have more elderly than available caregivers, it’s going to be a crisis worldwide,” Lu said. “We need to find new ways to take care of people efficiently and also gently, and robots are an important piece of that puzzle.”

Beyond medicine, human-caring robots could be deployed in disasters. They could search for injured and trapped people in an earthquake or a collapsed building, for example, and apply on-the-spot care, such as administering CPR.

E-skin technology senses pressure from contact, letting the attached machine know how much force to use to, for example, grab a cup or touch a person. But, when conventional e-skin is stretched, it also senses that deformation. That reading creates additional noise that skews the sensors’ ability to sense the pressure. That could lead to a robot using too much force to grab something.

In demonstrations, the stretchability allowed the researchers to create inflatable probes and grippers that could change shape to perform a variety of sensitive, touch-based tasks. The inflated skin-wrapped probe was used on human subjects to capture their pulse and pulse waves accurately. The deflated grippers can conformably hold on to a tumbler without dropping it, even when a coin is dropped inside. The device also pressed on a crispy taco shell without breaking it.

The key to this discovery is an innovative hybrid response pressure sensor that Lu and collaborators have been working on for years. While conventional e-skins are either capacitive or resistive, the hybrid response e-skin employs both responses to pressure. Perfecting these sensors, and combining them with stretchable insulating and electrode materials, enabled this e-skin innovation.
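The benefit of combining two sensing modalities can be illustrated with a toy linear model: if pressure and stretch affect the capacitive and resistive channels with different sensitivities, two readings suffice to solve for pressure independently of strain. The sensitivity values and linear model below are assumptions for illustration, not the actual device physics:

```python
import numpy as np

# Toy illustration of why two sensing modalities can decouple pressure from
# stretch. Assume (for illustration only) each channel responds linearly to
# pressure p and strain s with different sensitivities:
#   cap = a1*p + b1*s,   res = a2*p + b2*s
# Two independent readings let us invert the 2x2 system and recover p
# regardless of how much the skin is stretched.
A = np.array([[2.0, 1.0],    # capacitive sensitivities (a1, b1) — assumed
              [0.5, 3.0]])   # resistive sensitivities (a2, b2) — assumed

def recover_pressure(cap_reading, res_reading):
    p, s = np.linalg.solve(A, [cap_reading, res_reading])
    return p

# Pressure 1.0 under strain 0.4 gives readings (2.4, 1.7); the recovered
# pressure should be 1.0 no matter the strain.
p = recover_pressure(2.0 * 1.0 + 1.0 * 0.4, 0.5 * 1.0 + 3.0 * 0.4)
```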

High-throughput genetic manipulation of multicellular organisms using a machine-vision guided embryonic microinjection robot

by Andrew D Alegria, Amey S Joshi, Jorge Blanco Mendana, Kanav Khosla, Kieran T Smith, Benjamin Auch, Margaret Donovan, John Bischof, Daryl M Gohl, Suhasa B Kodandaramaiah in GENETICS

University of Minnesota Twin Cities researchers have constructed a robot that uses machine learning to fully automate a complicated microinjection process used in genetic research.

In their experiments, the researchers were able to use this automated robot to manipulate the genetics of multicellular organisms, including fruit fly and zebrafish embryos. The technology will save labs time and money while enabling them to more easily conduct new, large-scale genetic experiments that were not possible previously using manual techniques.

The work was co-led by two University of Minnesota mechanical engineering graduate students, Andrew Alegria and Amey Joshi. The team is also working to commercialize this technology to make it widely available through the University of Minnesota start-up company, Objective Biotechnology.

Microinjection is a method for introducing cells, genetic material, or other agents directly into embryos, cells, or tissues using a very fine pipette. The researchers have trained the robot to detect embryos that are one-hundredth the size of a grain of rice. After detection, the machine can calculate a path and automate the process of the injections.

“This new process is more robust and reproducible than manual injections,” said Suhasa Kodandaramaiah, a University of Minnesota mechanical engineering associate professor and senior author of the study. “With this model, individual laboratories will be able to think of new experiments that you couldn’t do without this type of technology.”

Robot hardware and operation.

Typically, this type of research requires highly skilled technicians to perform the microinjection, which many laboratories do not have. This new technology could expand the ability to perform large experiments in labs, while reducing time and costs.

“This is very exciting for the world of genetics. Writing and reading DNA have drastically improved in recent years, but having this technology will increase our ability to perform large-scale genetic experiments in a wide range of organisms,” said Daryl Gohl, a co-author of the study, the group leader of the University of Minnesota Genomics Center’s Innovation Lab and research assistant professor in the Department of Genetics, Cell Biology, and Development.

Not only can this technology be used in genetic experiments, but it can also help to preserve endangered species through cryopreservation, a preservation technique conducted at ultra-low temperatures.

“You can use this robot to inject nanoparticles into cells and tissues that help in cryopreservation and in the process of rewarming afterwards,” Kodandaramaiah explained.

Other team members highlighted other applications for the technology that could have even more impact.

“We hope that this technology could eventually be used for in vitro fertilization, where you could detect those eggs on the microscale level,” said Andrew Alegria, co-lead author on the paper and University of Minnesota mechanical engineering graduate research assistant in the Biosensing and Biorobotics Lab.

Maximum diffusion reinforcement learning

by Thomas A. Berrueta, Allison Pinosky, Todd D. Murphey in Nature Machine Intelligence

Northwestern University engineers have developed a new artificial intelligence (AI) algorithm designed specifically for smart robotics. By helping robots rapidly and reliably learn complex skills, the new method could significantly improve the practicality — and safety — of robots for a range of applications, including self-driving cars, delivery drones, household assistants and automation.

Called Maximum Diffusion Reinforcement Learning (MaxDiff RL), the algorithm’s success lies in its ability to encourage robots to explore their environments as randomly as possible in order to gain a diverse set of experiences. This “designed randomness” improves the quality of data that robots collect regarding their own surroundings. And, by using higher-quality data, simulated robots demonstrated faster and more efficient learning, improving their overall reliability and performance.

When tested against other AI platforms, simulated robots using Northwestern’s new algorithm consistently outperformed state-of-the-art models. The new algorithm works so well, in fact, that robots learned new tasks and then successfully performed them within a single attempt — getting it right the first time. This starkly contrasts with current AI models, which learn more slowly through trial and error.

“Other AI frameworks can be somewhat unreliable,” said Northwestern’s Thomas Berrueta, who led the study. “Sometimes they will totally nail a task, but, other times, they will fail completely. With our framework, as long as the robot is capable of solving the task at all, every time you turn on your robot you can expect it to do exactly what it’s been asked to do. This makes it easier to interpret robot successes and failures, which is crucial in a world increasingly dependent on AI.”

Berrueta is a Presidential Fellow at Northwestern and a Ph.D. candidate in mechanical engineering at the McCormick School of Engineering. Robotics expert Todd Murphey, a professor of mechanical engineering at McCormick and Berrueta’s adviser, is the paper’s senior author. Berrueta and Murphey co-authored the paper with Allison Pinosky, also a Ph.D. candidate in Murphey’s lab.

While the researchers have only tested their new algorithm on simulated robots, they built NoodleBot for future testing in the real world.

To train machine-learning algorithms, researchers and developers use large quantities of data, which humans carefully filter and curate. AI learns from this training data, using trial and error until it reaches optimal results. While this process works well for disembodied systems, like ChatGPT and Google Gemini (formerly Bard), it does not work for embodied AI systems like robots. Robots, instead, collect data by themselves — without the luxury of human curators.

“Traditional algorithms are not compatible with robotics in two distinct ways,” Murphey said. “First, disembodied systems can take advantage of a world where physical laws do not apply. Second, individual failures have no consequences. For computer science applications, the only thing that matters is that it succeeds most of the time. In robotics, one failure could be catastrophic.”

To solve this disconnect, Berrueta, Murphey and Pinosky aimed to develop a novel algorithm that ensures robots will collect high-quality data on-the-go. At its core, MaxDiff RL commands robots to move more randomly in order to collect thorough, diverse data about their environments. By learning through self-curated random experiences, robots acquire necessary skills to accomplish useful tasks.
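The idea of rewarding diverse exploration can be sketched with a simple novelty bonus: states far from anything recently visited earn extra reward. This nearest-neighbor bonus is a common stand-in for entropy- and diffusion-style objectives and is only an illustration of the concept, not the MaxDiff RL algorithm itself:

```python
import numpy as np

# Minimal sketch of rewarding diverse exploration: augment the task reward
# with a bonus for visiting states far from recently visited ones. This
# nearest-neighbor novelty bonus illustrates the concept of diffusion-style
# exploration; it is NOT the actual MaxDiff RL objective.
def novelty_bonus(state, recent_states, weight=0.1):
    if len(recent_states) == 0:
        return 0.0
    dists = [np.linalg.norm(state - s) for s in recent_states]
    return weight * min(dists)  # far from everything seen recently -> big bonus

def augmented_reward(task_reward, state, recent_states):
    return task_reward + novelty_bonus(state, recent_states)

# A state far from the visited set earns a larger bonus than a revisited one.
visited = [np.zeros(2), np.array([1.0, 0.0])]
r_new = augmented_reward(0.0, np.array([5.0, 5.0]), visited)
r_old = augmented_reward(0.0, np.array([0.0, 0.0]), visited)
```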

To test the new algorithm, the researchers compared it against current, state-of-the-art models. Using computer simulations, the researchers asked simulated robots to perform a series of standard tasks. Across the board, robots using MaxDiff RL learned faster than the other models. They also correctly performed tasks much more consistently and reliably than others.

Perhaps even more impressive: Robots using the MaxDiff RL method often succeeded at correctly performing a task in a single attempt. And that’s even when they started with no knowledge.

“Our robots were faster and more agile — capable of effectively generalizing what they learned and applying it to new situations,” Berrueta said. “For real-world applications where robots can’t afford endless time for trial and error, this is a huge benefit.”

Because MaxDiff RL is a general algorithm, it can be used for a variety of applications. The researchers hope it addresses foundational issues holding back the field, ultimately paving the way for reliable decision-making in smart robotics.

“This doesn’t have to be used only for robotic vehicles that move around,” Pinosky said. “It also could be used for stationary robots — such as a robotic arm in a kitchen that learns how to load the dishwasher. As tasks and physical environments become more complicated, the role of embodiment becomes even more crucial to consider during the learning process. This is an important step toward real systems that do more complicated, more interesting tasks.”

Low-Cost and Easy-to-Build Soft Robotic Skin for Safe and Contact-Rich Human–Robot Collaboration

by Kyungseo Park, Kazuki Shin, Sankalp Yamsani, Kevin Gim, Joohyung Kim in IEEE Transactions on Robotics

Soft skin coverings and touch sensors have emerged as a promising feature for robots that are both safer and more intuitive for human interaction, but they are expensive and difficult to make. A recent study demonstrates that soft skin pads doubling as sensors made from thermoplastic urethane can be efficiently manufactured using 3D printers.

“Robotic hardware can involve large forces and torques, so it needs to be made quite safe if it’s going to either directly interact with humans or be used in human environments,” said project lead Joohyung Kim, a professor of electrical & computer engineering at the University of Illinois Urbana-Champaign. “It’s expected that soft skin will play an important role in this regard since it can be used for both mechanical safety compliance and tactile sensing.”

As reported in the journal IEEE Transactions on Robotics, the 3D-printed pads function as both soft skin for a robotic arm and pressure-based mechanical sensors. The pads have airtight seals and connect to pressure sensors. Like a squeezed balloon, the pad deforms when it touches something, and the displaced air activates the pressure sensor.

Kim explained, “Tactile robotic sensors usually contain very complicated arrays of electronics and are quite expensive, but we have shown that functional, durable alternatives can be made very cheaply. Moreover, since it’s just a question of reprogramming a 3D printer, the same technique can be easily customized to different robotic systems.”

Concept of the soft robotic skin. The robotic skin consists of soft pneumatic pads and sensing electronics. The developed robotic skin was able to sense slow (continuous) and transient (dynamic) tactile stimuli.

The researchers demonstrated that this functionality can be naturally used for safety: if the pads detect anything near a dangerous area such as a joint, the arm automatically stops. They can also be used for operational functionality with the robot interpreting touches and taps as instructions.
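The sensing behavior described above can be sketched as simple event classification on one pad's pressure channel: sustained contact triggers a safety stop, while a brief contact is read as an instruction. The thresholds and classification rule below are assumptions for illustration, not the paper's actual detection logic:

```python
# Illustrative sketch of turning a single pressure channel per pad into
# "sustained press" vs "tap" events, as described above. Thresholds and the
# classification rule are assumptions for illustration, not the paper's.
def classify_contact(pressure_samples, threshold=1.5, sustain_samples=10):
    """pressure_samples: recent readings from one pad (arbitrary units)."""
    above = [p > threshold for p in pressure_samples]
    if not any(above):
        return "no_contact"
    # Long continuous contact -> press (e.g., stop the arm near a joint).
    if sum(above) >= sustain_samples:
        return "press"
    return "tap"  # brief contact -> interpret as an instruction

print(classify_contact([0.1] * 20))              # no contact detected
print(classify_contact([2.0] * 20))              # sustained press
print(classify_contact([0.1] * 18 + [2.0] * 2))  # brief tap
```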

Since 3D-printed parts are comparatively simple and inexpensive to manufacture, they can be easily adapted to new robotic systems and replaced. Kim noted that this feature is desirable in applications where cleaning and maintaining parts is expensive or infeasible.

“Imagine you want to use soft-skinned robots to assist in a hospital setting,” he said. “They would need to be regularly sanitized, or the skin would need to be regularly replaced. Either way, there’s a huge cost. However, 3D printing is a very scalable process, so interchangeable parts can be inexpensively made and easily snapped on and off the robot body.”

Tactile inputs like the kind provided by the new pads are a relatively unexplored facet of robotic sensing and control. Kim hopes that the ease of this new manufacturing technique will inspire more interest.

“Right now, computer vision and language models are the two major ways that humans can interact with robotic systems, but there is a need for more data on physical interactions, or ‘force-level’ data,” he said. “From the robot’s point of view, this information is the most direct interaction with its environment, but there are very few users — mostly researchers — who think about this. Collecting this force-level data is a target task for me and my group.”

Electrochemically actuated microelectrodes for minimally invasive peripheral nerve interfaces

by Chaoqun Dong, Alejandro Carnicer-Lombarte, Filippo Bonafè, Botian Huang, Sagnik Middya, Amy Jin, Xudong Tao, Sanggil Han, Manohar Bance, Damiano G. Barone, Beatrice Fraboni, George G. Malliaras in Nature Materials

Researchers have developed tiny, flexible devices that can wrap around individual nerve fibres without damaging them.

The researchers, from the University of Cambridge, combined flexible electronics and soft robotics techniques to develop the devices, which could be used for the diagnosis and treatment of a range of disorders, including epilepsy and chronic pain, or the control of prosthetic limbs.

Current tools for interfacing with the peripheral nerves — the 43 pairs of motor and sensory nerves that connect the brain and spinal cord to the rest of the body — are outdated, bulky and carry a high risk of nerve injury. However, the robotic nerve ‘cuffs’ developed by the Cambridge team are sensitive enough to grasp or wrap around delicate nerve fibres without causing any damage. Tests of the nerve cuffs in rats showed that the devices only require tiny voltages to change shape in a controlled way, forming a self-closing loop around nerves without the need for surgical sutures or glues.

The researchers say the combination of soft electrical actuators with neurotechnology could be an answer to minimally invasive monitoring and treatment for a range of neurological conditions.

Electric nerve implants can be used to either stimulate or block signals in target nerves. For example, they might help relieve pain by blocking pain signals, or they could be used to restore movement in paralysed limbs by sending electrical signals to the nerves. Nerve monitoring is also standard surgical procedure when operating in areas of the body containing a high concentration of nerve fibres, such as anywhere near the spinal cord.

These implants allow direct access to nerve fibres, but they come with certain risks. “Nerve implants come with a high risk of nerve injury,” said Professor George Malliaras from Cambridge’s Department of Engineering, who led the research. “Nerves are small and highly delicate, so anytime you put something large, like an electrode, in contact with them, it represents a danger to the nerves.”

“Nerve cuffs that wrap around nerves are the least invasive implants currently available, but despite this they are still too bulky, stiff and difficult to implant, requiring significant handling and potential trauma to the nerve,” said co-author Dr Damiano Barone from Cambridge’s Department of Clinical Neurosciences.

The researchers designed a new type of nerve cuff made from conducting polymers, normally used in soft robotics. The ultra-thin cuffs are engineered in two separate layers. Applying tiny amounts of electricity — just a few hundred millivolts — causes the devices to swell or shrink. The cuffs are small enough that they could be rolled up into a needle and injected near the target nerve. When activated electrically, the cuffs will change their shape to wrap around the nerve, allowing nerve activity to be monitored or altered.

“To ensure the safe use of these devices inside the body, we have managed to reduce the voltage required for actuation to very low values,” said Dr Chaoqun Dong, the paper’s first author. “What’s even more significant is that these cuffs can change shape in both directions and be reprogrammed. This means surgeons can adjust how tightly the device fits around a nerve until they get the best results for recording and stimulating the nerve.”

Tests in rats showed that the cuffs could be successfully placed without surgery, and they formed a self-closing loop around the target nerve. The researchers are planning further testing of the devices in animal models, and are hoping to begin testing in humans within the next few years.

Embedding Bifurcations into Pneumatic Artificial Muscle

by Nozomi Akashi, Yasuo Kuniyoshi, Taketomo Jo, Mitsuhiro Nishida, Ryo Sakurai, Yasumichi Wakao, Kohei Nakajima in Advanced Science

Creating robots to safely aid disaster victims is one challenge; executing flexible robot control that takes advantage of the material’s softness is another. The use of pliable soft materials to collaborate with humans and work in disaster areas has drawn much recent attention. However, controlling soft dynamics for practical applications has remained a significant challenge.

In collaboration with the University of Tokyo and Bridgestone Corporation, Kyoto University has now developed a method to control pneumatic artificial muscles, which are soft robotic actuators. Rich dynamics of these drive components can be exploited as a computational resource.

“We’ve demonstrated the actuator’s capability to autonomously generate diverse dynamics, including rhythmic patterns and chaos,” explains Nozomi Akashi of KyotoU’s Graduate School of Informatics.

Pneumatic artificial muscles, measurement systems, and pneumatic artificial muscle dynamics.

Traditionally, patterns were generated by externally attaching oscillators to robots, enabling locomotion and repetitive motions. However, such external oscillators compromise the very softness that makes these robots useful and ideally should be removed. Akashi’s team addresses this difficult issue to bring out the soft robots’ potential.

“In addition, the pattern-changing bifurcation structures can be embedded into the robotic actuator itself,” says Kohei Nakajima of the University of Tokyo’s Graduate School of Information Science and Technology.

The findings suggest that robots can generate qualitatively different patterns outside the learning data, paving the way for the development of robots capable of more adaptable and flexible movements.

“This could streamline the hardware and software development process, making it more efficient and effective,” concludes Akashi.

Clinical Outcomes of Standardized Central Venous Catheterization Simulation Training: A Comparative Analysis

by Jessica M. Gonzalez-Vargas, Elizabeth Sinz, Jason Z. Moore, Scarlett R. Miller in Journal of Surgical Education

More than five million central lines are placed in patients who need prolonged drug delivery, such as those undergoing cancer treatments, in the United States every year, yet the common procedure can lead to a bevy of complications in almost a million of those cases. To help decrease the rate of infections, blood clots and other complications associated with placing a central line catheter, Penn State researchers developed an online curriculum coupled with a hands-on simulation training to provide trainee physicians with more practice.

The training was deployed in 2022 at the Penn State College of Medicine. The researchers recently assessed how it impacted the prevalence of central line complications by comparing error rates from 2022–23, when the training had been fully implemented, with those from two prior academic years, 2016–17 and 2017–18, before the training was introduced. They found that all complication types — mechanical issues, infections and blood clots — occurred at significantly lower rates after the training was launched.

The researchers hold patents on the technology used in this work. In addition to working to improve the central line placement training, the team is also applying the framework to other common procedures with high complication rates, such as colonoscopies and laparoscopic surgeries.

“Our approach is focused on reducing preventable errors — this paper is the first significant clinical evidence that we are moving the needle on the gap in clinical education and clinical practice,” said Scarlett Miller, professor of industrial engineering and of mechanical engineering at Penn State and principal investigator on the project. “If we ensure physicians going through residency training are proficient in a skill, like placing central lines, we can minimize the risk on human life.”

Traditional training for placing a central line and other routine surgical procedures starts with a resident watching a more senior doctor complete the process. Then, the resident is expected to do the procedure themselves, and, finally, they teach someone else to do the procedure.

“The problem with that approach is that there are very few checks in the process, and the resident only improves by working with patients — who are at risk of complications,” Miller said. “The simulation approach allows someone to try the procedure hundreds, thousands of times without putting anyone at risk.”

The new approach — the result of interdisciplinary work between engineers and clinicians, Miller said — uses online and simulation-based training for standardized ultrasound-guided internal jugular central venous catheterization (US-IJCVC), in which a central line is placed into the internal jugular vein via the neck.

Residents first complete online training, which includes pre- and post-tests to evaluate knowledge gained. They then apply that knowledge in a skills lab, where they practice placing the central line on a novel dynamic haptic robotic trainer that can simulate various conditions and reactions. Residents can use ultrasound to image the line placement on the robotic trainer, just as they would on a real patient, and the trainer offers automated feedback.

“We started with 25 surgical residents at the Penn State Health Milton S. Hershey Medical Center, then expanded to all of the residents at Hershey and partnered with Cedars-Sinai Medical Center in Los Angeles to bring the training to their residents,” Miller said. “In total, we have trained about 700 physicians to date, and we train about 200 a year with our current funding.”

It seems practice may get physicians closer to perfect, without the risk to human life, according to Miller. In this study, Miller and her team compared error rates from 2022, the first year the simulation training was fully deployed, to error rates from 2016 and 2017, when the training was not yet established. They did not use data from 2018–21, as the training was partially implemented but undergoing startup adjustments and challenges related to COVID that could not be controlled for a direct comparison. The researchers found that the reported error rate for mechanical complications — such as puncturing an artery or misplacing the catheter — increased from 10.4% in 2016 to 12.4% in 2017 but dropped to 7.3% in 2022. The same pattern held for infection-related errors, with the 6.6% rate in 2016 increasing to 7.6% in 2017 and dropping to 4.1% in 2022. For blood clots, the error rates decreased from 12.3% in 2016 to 11.4% in 2017 to 8.1% in 2022.
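For context, the relative improvements implied by those figures can be computed directly (a quick sanity check on the reported numbers, not part of the study's own analysis):

```python
# Reported complication rates (percent) from the comparison above.
rates = {
    "mechanical": {2016: 10.4, 2017: 12.4, 2022: 7.3},
    "infection":  {2016: 6.6,  2017: 7.6,  2022: 4.1},
    "blood clot": {2016: 12.3, 2017: 11.4, 2022: 8.1},
}

# Relative reduction from the 2016 baseline to the post-training year.
reductions = {
    kind: round((r[2016] - r[2022]) / r[2016] * 100, 1)
    for kind, r in rates.items()
}

for kind, pct in reductions.items():
    print(f"{kind}: {pct}% relative reduction vs. 2016")
# mechanical: 29.8% relative reduction vs. 2016
# infection: 37.9% relative reduction vs. 2016
# blood clot: 34.1% relative reduction vs. 2016
```

In other words, each complication category fell by roughly a third relative to the 2016 baseline.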

“We’re very motivated by the results to improve the system and hopefully expand it to other hospitals,” Miller said. “We’re reducing the error rates in a significant way, but we want more. We want zero errors.”

Bioinspired multiscale adaptive suction on complex dry surfaces enhanced by regulated water secretion

by Tianqi Yue, Weiyong Si, Alex Keller, Chenguang Yang, Hermes Bloomfield-Gadêlha, Jonathan Rossiter in Proceedings of the National Academy of Sciences

A new robotic suction cup which can grasp rough, curved and heavy stone, has been developed by scientists at the University of Bristol.

The team, based at Bristol Robotics Laboratory, studied the structures of octopus biological suckers, which have superb adaptive suction abilities enabling them to anchor to rock. In their findings, the researchers show how they were able to create a multi-layer soft structure and an artificial fluidic system to mimic the musculature and mucus structures of biological suckers.

Suction is a highly evolved biological adhesion strategy for soft-body organisms to achieve strong grasping on various objects. Biological suckers can adaptively attach to dry complex surfaces such as rocks and shells, which are extremely challenging for current artificial suction cups. Although the adaptive suction of biological suckers is believed to be the result of their soft body’s mechanical deformation, some studies imply that in-sucker mucus secretion may be another critical factor in helping attach to complex surfaces, thanks to its high viscosity.

Suction cup grasping a stone.

Lead author Tianqi Yue explained: “The most important development is that we successfully demonstrated the effectiveness of the combination of mechanical conformation — the use of soft materials to conform to surface shape, and liquid seal — the spread of water onto the contacting surface for improving the suction adaptability on complex surfaces. This may also be the secret behind biological organisms’ ability to achieve adaptive suction.”

Their multi-scale suction mechanism is an organic combination of mechanical conformation and a regulated water seal. Multi-layer soft materials first generate a rough mechanical conformation to the substrate, reducing leaking apertures to just micrometres. The remaining micron-sized apertures are then sealed by regulated water secretion from an artificial fluidic system guided by a physical model, allowing the suction cup to maintain suction on diverse surfaces for long periods with minimal water overflow.

Tianqi added: “We believe the presented multi-scale adaptive suction mechanism is a powerful new adaptive suction strategy which may be instrumental in the development of versatile soft adhesion. Current industrial solutions use always-on air pumps to actively generate suction; however, these are noisy and waste energy.

“With no need for a pump, it is well known that many natural organisms with suckers, including octopuses, some fishes such as suckerfish and remoras, leeches, gastropods and echinoderms, can maintain their superb adaptive suction on complex surfaces by exploiting their soft body structures.”

The findings have great potential for industrial applications, such as providing a next-generation robotic gripper for grasping a variety of irregular objects.

Subscribe to Paradigm!

Medium. Twitter. Telegram. Telegram Chat. Reddit. LinkedIn.

Main sources

Research articles

Science Robotics

Science Daily

IEEE Spectrum