RT/ New electronic skin can react to pain like human skin

Published in Paradigm · Sep 4, 2020 · 25 min read

Robotics biweekly vol.12, 21st August — 4th September

TL;DR

  • Researchers have developed electronic artificial skin that reacts to pain just like real skin, opening the way to better prosthetics, smarter robotics and non-invasive alternatives to skin grafts.
  • A collaboration has created the first microscopic robots that incorporate semiconductor components, allowing them to be controlled — and made to walk — with standard electronic signals.
  • Teleoperated surgical robots are becoming commonplace in operating rooms, but many are massive (sometimes taking up an entire room) and difficult to manipulate. Medical researchers and engineers have now created the mini-RCM, a surgical robot the size of a tennis ball that weighs as much as a penny and performed significantly better than manually operated tools in delicate mock-surgical procedures.
  • Scientists have devised a novel approach to getting physically separated fish to interact with each other, leading to insights about what kinds of cues influence social behavior.
  • Robotic support pets used to reduce depression in older adults and people with dementia acquire bacteria over time, but a simple cleaning procedure can keep them from spreading illnesses, according to a new study.
  • During the current coronavirus pandemic, one of the riskiest parts of a health care worker’s job is assessing people who have symptoms of Covid-19. Researchers from MIT and Brigham and Women’s Hospital hope to reduce that risk by using robots to remotely measure patients’ vital signs.
  • Although true ‘cyborgs’ are science fiction, researchers are moving toward integrating electronics with the body. Such devices could monitor tumors or replace damaged tissues. But connecting electronics directly to human tissues in the body is a huge challenge. Today, a team is reporting new coatings for components that could help them more easily fit into this environment.
  • Tokyo startup Telexistence has recently unveiled a new robot called the Model-T, an advanced teleoperated humanoid that can use tools and grasp a wide range of objects.
  • The University of Michigan has a fancy new treadmill that’s built right into the floor, which proves to be a bit much for Mini Cheetah.
  • Researchers from the Wyss Institute, Harvard SEAS, and Sony have created the mini-RCM, a small surgical robot that can help surgeons perform delicate teleoperated procedures on the human body.
  • Something is different, and you can’t quite put your finger on it. But your robot can.
  • The new August issue of ‘Science Robotics’ is out! Introducing a methanol-powered beetle bot, flexible zinc-air batteries, and more.
  • Check out upcoming robotics events below. And more!

Robotics market

The global market for robots is expected to grow at a compound annual growth rate (CAGR) of around 26 percent to reach just under 210 billion U.S. dollars by 2025. It is predicted that this market will hit the 100 billion U.S. dollar mark in 2020.

Size of the global market for industrial and non-industrial robots between 2018 and 2025 (in billion U.S. dollars). Source: Statista

Latest Research

Artificial Somatosensors: Feedback Receptors for Electronic Skins

by Md. Ataur Rahman, Sumeet Walia, Sumaiya Naznee, Mohammad Taha, Shruti Nirantar, Fahmida Rahman, Madhu Bhaskaran, Sharath Sriram in Advanced Intelligent Systems

Researchers have developed electronic artificial skin that reacts to pain just like real skin, opening the way to better prosthetics, smarter robotics and non-invasive alternatives to skin grafts.

The prototype device developed by a team at RMIT University in Melbourne, Australia, can electronically replicate the way human skin senses pain.

The device mimics the body’s near-instant feedback response and can react to painful sensations with the same lightning speed with which nerve signals travel to the brain.

Lead researcher Professor Madhu Bhaskaran said the pain-sensing prototype was a significant advance towards next-generation biomedical technologies and intelligent robotics.

“Skin is our body’s largest sensory organ, with complex features designed to send rapid-fire warning signals when anything hurts,” Bhaskaran said.

“We’re sensing things all the time through the skin but our pain response only kicks in at a certain point, like when we touch something too hot or too sharp. No electronic technologies have been able to realistically mimic that very human feeling of pain — until now. Our artificial skin reacts instantly when pressure, heat or cold reach a painful threshold. It’s a critical step forward in the future development of the sophisticated feedback systems that we need to deliver truly smart prosthetics and intelligent robotics.”

Working principle of biological and implemented artificial receptors. a) Biological receptor for pressure stimulus for Pacinian corpuscle and noxious stimulus for thermal receptor and nociceptor. b) Artificial Pacinian corpuscle when no pressure is applied with moderate current flow in both circuit paths. c) Artificial Pacinian corpuscle under applied pressure showing remarkably high current flow through the memristor. d) Artificial thermoreceptor and nociceptor when no temperature is applied with no current flow through the memristor. e) Artificial thermoreceptor and nociceptor when temperature is applied, initiating a conductive filament (red dots) in the memristor, resulting in high current flow.

Functional sensing prototypes

As well as the pain-sensing prototype, the research team has also developed devices using stretchable electronics that can sense and respond to changes in temperature and pressure.

Bhaskaran, co-leader of the Functional Materials and Microsystems group at RMIT, said the three functional prototypes were designed to deliver key features of the skin’s sensing capability in electronic form.

With further development, the stretchable artificial skin could also be a future option for non-invasive skin grafts, where the traditional approach is not viable or not working.

“We need further development to integrate this technology into biomedical applications but the fundamentals — biocompatibility, skin-like stretchability — are already there,” Bhaskaran said.

How to make electronic skin

The new research combines three technologies previously pioneered and patented by the team:

  • Stretchable electronics: combining oxide materials with biocompatible silicone to deliver transparent, unbreakable and wearable electronics as thin as a sticker.
  • Temperature-reactive coatings: self-modifying coatings 1,000 times thinner than a human hair based on a material that transforms in response to heat.
  • Brain-mimicking memory: electronic memory cells that imitate the way the brain uses long-term memory to recall and retain previous information.

The pressure sensor prototype combines stretchable electronics and long-term memory cells, the heat sensor brings together temperature-reactive coatings and memory, while the pain sensor integrates all three technologies.

PhD researcher Md Ataur Rahman said the memory cells in each prototype were responsible for triggering a response when the pressure, heat or pain reached a set threshold.

“We’ve essentially created the first electronic somatosensors — replicating the key features of the body’s complex system of neurons, neural pathways and receptors that drive our perception of sensory stimuli,” he said.

“While some existing technologies have used electrical signals to mimic different levels of pain, these new devices can react to real mechanical pressure, temperature and pain, and deliver the right electronic response. It means our artificial skin knows the difference between gently touching a pin with your finger or accidentally stabbing yourself with it — a critical distinction that has never been achieved before electronically.”
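
A minimal sketch of that threshold behavior, assuming a memristor-like element that switches from a low- to a high-conductance state once the stimulus crosses a set "pain" threshold; all values here are illustrative, not from the paper:

```python
# Toy model of a threshold-triggered artificial receptor: below the threshold
# the element stays in a low-conductance state; once the stimulus crosses it,
# conductance (and output current) jumps sharply, mimicking the all-or-nothing
# warning signal described above.

PAIN_THRESHOLD_C = 50.0      # hypothetical "too hot" threshold, deg C
G_LOW, G_HIGH = 1e-6, 1e-3   # low/high conductance states, siemens (illustrative)
READ_VOLTAGE_V = 0.5         # bias applied across the memristive element

def receptor_current(stimulus_c: float) -> float:
    """Output current for a given temperature stimulus."""
    g = G_HIGH if stimulus_c >= PAIN_THRESHOLD_C else G_LOW
    return g * READ_VOLTAGE_V

for temp in (25.0, 45.0, 50.0, 70.0):
    print(f"{temp:5.1f} C -> {receptor_current(temp):.1e} A")
```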

Electronically integrated, mass-manufactured, microscopic robots

by Marc Z. Miskin, Alejandro J. Cortese, Kyle Dorsey, Edward P. Esposito, Michael F. Reynolds, Qingkun Liu, Michael Cao, David A. Muller, Paul L. McEuen, Itai Cohen in Nature

A collaboration has created the first microscopic robots that incorporate semiconductor components, allowing them to be controlled — and made to walk — with standard electronic signals.

These robots, roughly the size of a paramecium, provide a template for building even more complex versions that utilize silicon-based intelligence, can be mass produced, and may someday travel through human tissue and blood.

The collaboration is led by Itai Cohen, professor of physics; Paul McEuen, the John A. Newman Professor of Physical Science; and their former postdoctoral researcher Marc Miskin, who is now an assistant professor at the University of Pennsylvania.

The walking robots are the latest iteration, and in many ways an evolution, of Cohen and McEuen’s previous nanoscale creations, from microscopic sensors to graphene-based origami machines.

The new robots are about 5 microns thick (a micron is one-millionth of a meter), 40 microns wide and range from 40 to 70 microns in length. Each bot consists of a simple circuit made from silicon photovoltaics — which essentially functions as the torso and brain — and four electrochemical actuators that function as legs.

The researchers control the robots by flashing laser pulses at different photovoltaics, each of which charges up a separate set of legs. By toggling the laser back and forth between the front and back photovoltaics, the robot walks.
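
Reduced to pseudocode, the gait control amounts to toggling the laser between the two photovoltaic channels. A toy sketch under that reading, where `flash` is a hypothetical stand-in for steering the laser:

```python
import time

def flash(photovoltaic: str) -> None:
    # Hypothetical stand-in for steering the laser spot onto one
    # photovoltaic; each flash charges and bends one set of legs.
    print(f"laser -> {photovoltaic} photovoltaic, one pair of legs strokes")

def walk(steps: int, period_s: float = 0.1) -> None:
    """Alternate front/back flashes to produce a walking gait."""
    for step in range(steps):
        flash("front" if step % 2 == 0 else "back")
        time.sleep(period_s)  # allow the electrochemical actuators to respond

walk(6)
```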

The robots are certainly high-tech, but they operate with low voltage (200 millivolts) and low power (10 nanowatts), and remain strong and robust for their size. Because they are made with standard lithographic processes, they can be fabricated in parallel: About 1 million bots fit on a 4-inch silicon wafer.
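
A quick back-of-the-envelope check on that figure, using the dimensions quoted above (ignoring spacing, tethers and support structures, which is why the ideal packing comes out higher than the quoted one million):

```python
import math

wafer_diameter_um = 4 * 25_400                  # 4-inch wafer, in microns
wafer_area_um2 = math.pi * (wafer_diameter_um / 2) ** 2

bot_area_um2 = 40 * 70                          # widest and longest quoted footprint
print(f"{wafer_area_um2 / bot_area_um2:,.0f} footprints at zero spacing")  # ~2.9 million
```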

The researchers are exploring ways to soup up the robots with more complicated electronics and onboard computation — improvements that could one day result in swarms of microscopic robots crawling through and restructuring materials, or suturing blood vessels, or being dispatched en masse to probe large swaths of the human brain.

“Controlling a tiny robot is maybe as close as you can come to shrinking yourself down. I think machines like these are going to take us into all kinds of amazing worlds that are too small to see,” said Miskin, the study’s lead author.

“This research breakthrough provides exciting scientific opportunity for investigating new questions relevant to the physics of active matter and may ultimately lead to futuristic robotic materials,” said Sam Stanton, program manager for the Army Research Office, an element of the Combat Capabilities Development Command’s Army Research Laboratory, which supported the research.

Origami-inspired miniature manipulator for teleoperated microsurgery

by Hiroyuki Suzuki, Robert J. Wood in Nature Machine Intelligence

Teleoperated surgical robots are becoming commonplace in operating rooms, but many are massive (sometimes taking up an entire room) and difficult to manipulate. Medical researchers and engineers have now created the mini-RCM, a surgical robot the size of a tennis ball that weighs as much as a penny and performed significantly better than manually operated tools in delicate mock-surgical procedures.

Minimally invasive laparoscopic surgery, in which a surgeon uses tools and a tiny camera inserted into small incisions to perform operations, has made surgical procedures safer for both patients and doctors over the last half-century. Recently, surgical robots have started to appear in operating rooms to further assist surgeons by allowing them to manipulate multiple tools at once with greater precision, flexibility, and control than is possible with traditional techniques. However, these robotic systems are extremely large, often taking up an entire room, and their tools can be much larger than the delicate tissues and structures on which they operate.

A collaboration between Wyss Associate Faculty member Robert Wood, Ph.D. and Robotics Engineer Hiroyuki Suzuki of Sony Corporation has brought surgical robotics down to the microscale by creating a new, origami-inspired miniature remote center of motion manipulator (the “mini-RCM”). The robot is the size of a tennis ball, weighs about as much as a penny, and successfully performed a difficult mock surgical task, as described in a recent issue of Nature Machine Intelligence.

“The Wood lab’s unique technical capabilities for making micro-robots have led to a number of impressive inventions over the last few years, and I was convinced that it also had the potential to make a breakthrough in the field of medical manipulators as well,” said Suzuki, who began working with Wood on the mini-RCM in 2018 as part of a Harvard-Sony collaboration. “This project has been a great success.”

A mini robot for micro tasks

To create their miniature surgical robot, Suzuki and Wood turned to the Pop-Up MEMS manufacturing technique developed in Wood’s lab, in which materials are deposited on top of each other in layers that are bonded together, then laser-cut in a specific pattern that allows the desired three-dimensional shape to “pop up,” as in a children’s pop-up picture book. This technique greatly simplifies the mass-production of small, complex structures that would otherwise have to be painstakingly constructed by hand.

The team created a parallelogram shape to serve as the main structure of the robot, then fabricated three linear actuators (mini-LAs) to control the robot’s movement: one parallel to the bottom of the parallelogram that raises and lowers it, one perpendicular to the parallelogram that rotates it, and one at the tip of the parallelogram that extends and retracts the tool in use. The result was a robot that is much smaller and lighter than other microsurgical devices previously developed in academia.

The mini-LAs are themselves marvels in miniature, built around a piezoelectric ceramic material that changes shape when an electrical field is applied. The shape change pushes the mini-LA’s “runner unit” along its “rail unit” like a train on train tracks, and that linear motion is harnessed to move the robot. Because piezoelectric materials inherently deform as they change shape, the team also integrated LED-based optical sensors into the mini-LA to detect and correct any deviations from the desired movement, such as those caused by hand tremors.
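
A minimal sketch of that kind of closed-loop correction, assuming the optical sensor reports the runner's measured position and a simple proportional controller nudges the actuator toward its target. The gain, tolerance and callables are hypothetical, not the mini-LA's actual control code:

```python
import random

def correct_position(target_um, read_sensor, drive_actuator,
                     gain=0.5, tol_um=0.1, max_iters=100):
    """Drive the actuator until the optically measured position matches
    the target, correcting deviations such as those from hand tremor."""
    for _ in range(max_iters):
        error_um = target_um - read_sensor()
        if abs(error_um) < tol_um:
            return True                    # settled within tolerance
        drive_actuator(gain * error_um)    # command a corrective step
    return False

# Simulated usage: a noisy sensor reading around a mutable position.
pos = [0.0]
ok = correct_position(
    10.0,
    read_sensor=lambda: pos[0] + random.gauss(0.0, 0.02),
    drive_actuator=lambda step: pos.__setitem__(0, pos[0] + step),
)
print(ok, round(pos[0], 2))
```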

Steadier than a surgeon’s hands

To mimic the conditions of a teleoperated surgery, the team connected the mini-RCM to a Phantom Omni device, which manipulated the mini-RCM in response to the movements of a user’s hand controlling a pen-like tool. Their first test evaluated a human’s ability to trace a tiny square smaller than the tip of a ballpoint pen, looking through a microscope and either tracing it by hand or tracing it using the mini-RCM. Using the mini-RCM dramatically improved accuracy, reducing error by 68% compared to manual operation — an especially important quality given the precision required to repair small and delicate structures in the human body.

Given the mini-RCM’s success on the tracing test, the researchers then created a mock version of a surgical procedure called retinal vein cannulation, in which a surgeon must carefully insert a needle through the eye to inject therapeutics into the tiny veins at the back of the eyeball. They fabricated a silicone tube the same size as the retinal vein (about twice the thickness of a human hair), and successfully punctured it with a needle attached to the end of the mini-RCM without causing local damage or disruption.

In addition to its efficacy in performing delicate surgical maneuvers, the mini-RCM’s small size provides another important benefit: it is easy to set up and install and, in the case of a complication or electrical outage, the robot can be easily removed from a patient’s body by hand.

“The Pop-Up MEMS method is proving to be a valuable approach in a number of areas that require small yet sophisticated machines, and it was very satisfying to know that it has the potential to improve the safety and efficiency of surgeries to make them even less invasive for patients,” said Wood, who is also the Charles River Professor of Engineering and Applied Sciences at Harvard’s John A. Paulson School of Engineering and Applied Sciences (SEAS).

The researchers aim to increase the force of the robot’s actuators to cover the maximum forces experienced during an operation, and improve its positioning precision. They are also investigating using a laser with a shorter pulse during the machining process, to improve the mini-LAs’ sensing resolution.

“This unique collaboration between the Wood lab and Sony illustrates the benefits that can arise from combining the real-world focus of industry with the innovative spirit of academia, and we look forward to seeing the impact this work will have on surgical robotics in the near future,” said Wyss Institute Founding Director Don Ingber, M.D., Ph.D., who is also the Judah Folkman Professor of Vascular Biology at Harvard Medical School and Boston Children’s Hospital, and Professor of Bioengineering at SEAS.

Behavioral Teleporting of Individual Ethograms onto Inanimate Robots: Experiments on Social Interactions in Live Zebrafish

by Mert Karakaya, Simone Macrì, Maurizio Porfiri in iScience

Researchers have devised a novel approach to getting physically separated fish to interact with each other, leading to insights about what kinds of cues influence social behavior.

The team was led by Maurizio Porfiri, Institute Professor at NYU Tandon.

The innovative system, called “behavioral teleporting” — the transfer of the complete inventory of behaviors and actions (ethogram) of a live zebrafish onto a remotely located robotic replica — allowed the investigators to independently manipulate multiple factors underpinning social interactions in real-time. The research, “Behavioral teleporting of individual ethograms onto inanimate robots: experiments on social interactions in live zebrafish,” appears in the Cell Press journal iScience.

The team, including Mert Karakaya, a Ph.D. candidate in the Department of Mechanical and Aerospace Engineering at NYU Tandon, and Simone Macrì of the Centre for Behavioral Sciences and Mental Health, Istituto Superiore di Sanità, Rome, devised a setup consisting of two separate tanks, each containing one fish and one robotic replica. Within each tank, the live fish of the pair swam with the zebrafish replica matching the morphology and locomotory pattern of the live fish located in the other tank.

An automated tracking system scored each of the live subjects’ locomotory patterns, which were, in turn, used to control the robotic replica swimming in the other tank via an external manipulator. Therefore, the system allowed the transfer of the complete ethogram of each fish across tanks within a fraction of a second, establishing a complex robotics-mediated interaction between two remotely-located live animals. By independently controlling the morphology of these robots, the team explored the link between appearance and movements in social behavior.
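
In outline, each control cycle reads one fish's tracked state and replays it on the replica in the other tank. A schematic sketch with stand-in classes; the real system uses video tracking and an external manipulator, and none of these names are from the paper:

```python
from dataclasses import dataclass

@dataclass
class FishState:
    x: float        # position in the tank
    y: float
    heading: float  # degrees

class Tracker:
    """Stand-in for the automated tracker scoring a live fish's movements."""
    def read(self) -> FishState:
        return FishState(0.0, 0.0, 0.0)   # a real tracker returns live data

class Replica:
    """Stand-in for the manipulator driving a robotic replica."""
    def move_to(self, s: FishState) -> None:
        print(f"replica -> ({s.x:.1f}, {s.y:.1f}), heading {s.heading:.0f}")

def teleport_step(tracker_a, tracker_b, replica_a, replica_b):
    # Each replica mirrors the live fish in the *other* tank, transferring
    # the full ethogram across tanks within a control cycle.
    replica_b.move_to(tracker_a.read())
    replica_a.move_to(tracker_b.read())

teleport_step(Tracker(), Tracker(), Replica(), Replica())
```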

The investigators found that the replica teleported the fish motion in almost all trials (85% of the total experimental time), with a 95% accuracy at a maximum time lag of less than two-tenths of a second. The high accuracy in the replication of fish trajectory was confirmed by equivalent analysis on speed, turn rate, and acceleration.

Porfiri explained that the behavioral teleporting system avoids the limits of typical modeling using robots.

“Since existing approaches involve the use of a mathematical representation of social behavior for controlling the movements of the replica, they often lead to unnatural behavioral responses of live animals,” he said. “But because behavioral teleporting ‘copy/pastes’ the behavior of a live fish onto robotic proxies, it confers a high degree of precision with respect to such factors as position, speed, turn rate, and acceleration.”

Porfiri’s previous research proving robots are viable as behavior models for zebrafish showed that schools of zebrafish could be made to follow the lead of their robotic counterparts.

“In humans, social behavior unfolds in actions, habits, and practices that ultimately define our individual life and our society,” added Macrì. “These depend on complex processes, mediated by individual traits — baldness, height, voice pitch, and outfit, for example — and behavioral feedback, vectors that are often difficult to isolate. This new approach demonstrates that we can isolate influences on the quality of social interaction and determine which visual features really matter.”

The research included experiments to understand the asymmetric relationship between large and small fish and identify leader/follower roles, in which a large fish swam with a small replica that mirrored the behavior of the small fish positioned in the other tank and vice-versa.

Karakaya said the team was surprised to find that the smaller — not larger — fish “led” the interactions.

“There are no strongly conclusive results on why that could be, but one reason might be due to the ‘curious’ nature of the smaller individuals to explore a novel space,” he said. “In known environments, large fish tend to lead; however, in new environments larger and older animals can be cautious in their approach, whereas the smaller and younger ones could be ‘bolder.’”

The method also led to the discovery that interaction between fish was not determined by locomotor patterns alone, but also by appearance.

“It is interesting to see that, as is the case with our own species, there is a relationship between appearance and social interaction,” he added.

Karakaya added that this could serve as an important tool for human interactions in the near future, whereby, through the closed-loop teleporting, people could use robots as proxies of themselves.

“One example would be the colonies on Mars, where experts from Earth could use humanoid robots as an extension of themselves to interact with the environment and people there. This would provide easier and more accurate medical examination, improve human contact, and reduce isolation. Detailed studies on the behavioral and psychological effects of these proxies must be completed to better understand how these techniques can be implemented into daily life.”

This work was supported by the National Science Foundation, the National Institute on Drug Abuse, and the Office of Behavioral and Social Sciences Research.

‘Cyborg’ technology could enable new diagnostics, merger of humans and AI

Although true ‘cyborgs’ are science fiction, researchers are moving toward integrating electronics with the body. Such devices could monitor tumors or replace damaged tissues. But connecting electronics directly to human tissues in the body is a huge challenge. Today, a team is reporting new coatings for components that could help them more easily fit into this environment.

The researchers will present their results today at the American Chemical Society (ACS) Fall 2020 Virtual Meeting & Expo.

“We got the idea for this project because we were trying to interface rigid, inorganic microelectrodes with the brain, but brains are made out of organic, salty, live materials,” says David Martin, Ph.D., who led the study. “It wasn’t working well, so we thought there must be a better way.”

Traditional microelectronic materials, such as silicon, gold, stainless steel and iridium, cause scarring when implanted. For applications in muscle or brain tissue, electrical signals need to flow for them to operate properly, but scars interrupt this activity. The researchers reasoned that a coating could help.

“We started looking at organic electronic materials like conjugated polymers that were being used in non-biological devices,” says Martin, who is at the University of Delaware. “We found a chemically stable example that was sold commercially as an antistatic coating for electronic displays.” After testing, the researchers found that the polymer had the properties necessary for interfacing hardware and human tissue.

“These conjugated polymers are electrically active, but they are also ionically active,” Martin says. “Counter ions give them the charge they need so when they are in operation, both electrons and ions are moving around.” The polymer, known as poly(3,4-ethylenedioxythiophene) or PEDOT, dramatically improved the performance of medical implants by lowering their impedance by two to three orders of magnitude, thus increasing signal quality and battery lifetime in patients.

Martin has since determined how to specialize the polymer, putting different functional groups on PEDOT. Adding a carboxylic acid, aldehyde or maleimide substituent to the ethylenedioxythiophene (EDOT) monomer gives the researchers the versatility to create polymers with a variety of functions.

“The maleimide is particularly powerful because we can do click chemistry substitutions to make functionalized polymers and biopolymers,” Martin says. Mixing unsubstituted monomer with the maleimide-substituted version results in a material with many locations where the team can attach peptides, antibodies or DNA. “Name your favorite biomolecule, and you can in principle make a PEDOT film that has whatever biofunctional group you might be interested in,” he says.

Most recently, Martin’s group created a PEDOT film with an antibody for vascular endothelial growth factor (VEGF) attached. VEGF stimulates blood vessel growth after injury, and tumors hijack this protein to increase their blood supply. The polymer that the team developed could act as a sensor to detect overexpression of VEGF and thus early stages of disease, among other potential applications.

Other functionalized polymers have neurotransmitters on them, and these films could help sense or treat brain or nervous system disorders. So far, the team has made a polymer with dopamine, which plays a role in addictive behaviors, as well as dopamine-functionalized variants of the EDOT monomer. Martin says these biological-synthetic hybrid materials might someday be useful in merging artificial intelligence with the human brain.

Ultimately, Martin says, his dream is to be able to tailor how these materials deposit on a surface and then to put them in tissue in a living organism. “The ability to do the polymerization in a controlled way inside a living organism would be fascinating.”

Robot takes contact-free measurements of patients’ vital signs

Mobile system could reduce health care workers’ exposure to Covid-19 virus.

During the current coronavirus pandemic, one of the riskiest parts of a health care worker’s job is assessing people who have symptoms of Covid-19. Researchers from MIT and Brigham and Women’s Hospital hope to reduce that risk by using robots to remotely measure patients’ vital signs.

The robots, which are controlled by a handheld device, can also carry a tablet that allows doctors to ask patients about their symptoms without being in the same room.

“In robotics, one of our goals is to use automation and robotic technology to remove people from dangerous jobs,” says Hen-Wei Huang, an MIT postdoc. “We thought it should be possible for us to use a robot to remove the health care worker from the risk of directly exposing themselves to the patient.”

Using four cameras mounted on a dog-like robot developed by Boston Dynamics, the researchers have shown that they can measure skin temperature, breathing rate, pulse rate, and blood oxygen saturation in healthy patients, from a distance of 2 meters. They are now making plans to test it in patients with Covid-19 symptoms.

“We are thrilled to have forged this industry-academia partnership in which scientists with engineering and robotics expertise worked with clinical teams at the hospital to bring sophisticated technologies to the bedside,” says Giovanni Traverso, an MIT assistant professor of mechanical engineering, a gastroenterologist at Brigham and Women’s Hospital, and the senior author of the study.

The researchers have posted a paper on their system on the preprint server TechRxiv, and have submitted it to a peer-reviewed journal. Huang is one of the lead authors of the study, along with Peter Chai, an assistant professor of emergency medicine at Brigham and Women’s Hospital, and Claas Ehmke, a visiting scholar from ETH Zurich.

Measuring vital signs

When Covid-19 cases began surging in Boston in March, many hospitals, including Brigham and Women’s, set up triage tents outside their emergency departments to evaluate people with Covid-19 symptoms. One major component of this initial evaluation is measuring vital signs, including body temperature.

The MIT and BWH researchers came up with the idea to use robotics to enable contactless monitoring of vital signs, to allow health care workers to minimize their exposure to potentially infectious patients. They decided to use existing computer vision technologies that can measure temperature, breathing rate, pulse, and blood oxygen saturation, and worked to make them mobile.

To achieve that, they used a robot known as Spot, which can walk on four legs, similarly to a dog. Health care workers can maneuver the robot to wherever patients are sitting, using a handheld controller. The researchers mounted four different cameras onto the robot — an infrared camera plus three monochrome cameras that filter different wavelengths of light.

The researchers developed algorithms that allow them to use the infrared camera to measure both elevated skin temperature and breathing rate. For body temperature, the camera measures skin temperature on the face, and the algorithm correlates that temperature with core body temperature. The algorithm also takes into account the ambient temperature and the distance between the camera and the patient, so that measurements can be taken from different distances, under different weather conditions, and still be accurate.
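
A hedged sketch of what such a correction can look like, assuming a simple linear model over ambient temperature and camera-to-subject distance; the coefficients below are invented for illustration and are not the study's fitted values:

```python
def estimate_core_temp_c(skin_c: float, ambient_c: float, dist_m: float) -> float:
    """Toy correction: facial skin reads cooler in cold air and at longer
    camera range, so compensate both before adding a fixed skin-to-core
    offset. All coefficients are illustrative only."""
    ambient_correction = 0.05 * (34.0 - ambient_c)  # colder air -> larger boost
    distance_correction = 0.10 * dist_m             # IR signal falls off with range
    return skin_c + ambient_correction + distance_correction + 2.5

print(f"{estimate_core_temp_c(skin_c=34.2, ambient_c=22.0, dist_m=2.0):.1f} C")  # 37.5
```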

Measurements from the infrared camera can also be used to calculate the patient’s breathing rate. As the patient breathes in and out, wearing a mask, their breath changes the temperature of the mask. Measuring this temperature change allows the researchers to calculate how rapidly the patient is breathing.
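
That oscillation makes the breathing rate recoverable as the dominant low-frequency peak in the mask-temperature signal. A minimal NumPy sketch on synthetic data; the frame rate, band limits and signal model are assumptions, not the authors' pipeline:

```python
import numpy as np

fs = 30.0                                # assumed camera frame rate, Hz
t = np.arange(0, 30, 1 / fs)             # 30 s of mask-temperature samples
true_rate_hz = 16 / 60                   # synthetic subject: 16 breaths/min
temp = 0.3 * np.sin(2 * np.pi * true_rate_hz * t) + 0.05 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(temp - temp.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
band = (freqs > 0.1) & (freqs < 0.7)     # plausible breathing band: 6-42 breaths/min
rate_hz = freqs[band][np.argmax(spectrum[band])]
print(f"estimated breathing rate: {rate_hz * 60:.1f} breaths/min")  # ~16
```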

The three monochrome cameras each filter a different wavelength of light — 670, 810, and 880 nanometers. These wavelengths allow the researchers to measure the slight color changes that result when hemoglobin in blood cells binds to oxygen and flows through blood vessels. The researchers’ algorithm uses these measurements to calculate both pulse rate and blood oxygen saturation.
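
Pulse oximetry of this kind conventionally ends in a "ratio of ratios": compare the pulsatile (AC) to the steady (DC) component at two differently absorbed wavelengths, then map the ratio to SpO2 through an empirical calibration; pulse rate would come from the periodicity of the same ripple, as in the breathing-rate sketch above. A schematic sketch under that assumption, with placeholder calibration constants and an illustrative choice of the 670/880 nm pair, not values from the paper:

```python
import numpy as np

def spo2_ratio_of_ratios(sig_670: np.ndarray, sig_880: np.ndarray) -> float:
    """Estimate SpO2 from the AC/DC ratio at two wavelengths.
    The linear calibration below is a placeholder."""
    def ac_over_dc(sig: np.ndarray) -> float:
        return (sig.max() - sig.min()) / sig.mean()
    r = ac_over_dc(sig_670) / ac_over_dc(sig_880)
    return 110.0 - 25.0 * r

# Synthetic example: a small cardiac ripple (~72 bpm) on each channel.
t = np.linspace(0.0, 10.0, 300)
sig_670 = 1.0 + 0.008 * np.sin(2 * np.pi * 1.2 * t)
sig_880 = 1.0 + 0.015 * np.sin(2 * np.pi * 1.2 * t)
print(f"SpO2 ~ {spo2_ratio_of_ratios(sig_670, sig_880):.0f}%")  # ~97
```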

“We didn’t really develop new technology to do the measurements,” Huang says. “What we did is integrate them together very specifically for the Covid application, to analyze different vital signs at the same time.”

Continuous monitoring

In this study, the researchers performed the measurements on healthy volunteers, and they are now making plans to test their robotic approach in people who are showing symptoms of Covid-19, in a hospital emergency department.

While in the near term, the researchers plan to focus on triage applications, in the longer term, they envision that the robots could be deployed in patients’ hospital rooms. This would allow the robots to continuously monitor patients and also allow doctors to check on them, via tablet, without having to enter the room. Both applications would require approval from the U.S. Food and Drug Administration.

The research was funded by the MIT Department of Mechanical Engineering and the Karl van Tassel (1925) Career Development Professorship.

Microbial contamination and efficacy of disinfection procedures of companion robots in care homes

by Hannah Louise Bradwell, Christopher W. Johnson, John Lee, Rhona Winnington, Serge Thill, Ray B. Jones in PLOS ONE

Robotic support pets used to reduce depression in older adults and people with dementia acquire bacteria over time, but a simple cleaning procedure can keep them from spreading illnesses, according to a new study published August 26, 2020 in the open-access journal PLOS ONE by Hannah Bradwell of the University of Plymouth, UK and colleagues.

There is a wealth of research on the use of social robots, or companion robots, in care and long-term nursing homes. “Paro the robot seal” and other robotic animals have been linked to reductions in depression, agitation, loneliness, nursing staff stress, and medication use — especially relevant during this period of pandemic-related social isolation.

In the new study, researchers measured the microbial load found on the surface of eight different robot animals (Paro, Miro, Pleo rb, Joy for All dog, Joy for All cat, Furby Connect, Perfect Petzzz dog, and Handmade Hedgehog) after interaction with four care home residents, and again after cleaning by a researcher or care home staff member. The animals ranged in material from fur to soft plastic to solid plastic. The cleaning process involved spraying with anti-bacterial product, brushing any fur, and vigorous cleaning with anti-bacterial wipes.

Most of the devices gathered enough harmful microbes during 20 minutes of standard use to have a microbial load above the acceptable threshold of 2.5 CFU/cm² (colony-forming units per square centimetre). Only the Joy for All cat and the MiRo robot remained below this level when microbes were measured after a 48-hour incubation period; microbial loads on the other six robots ranged from 2.56 to 17.28 CFU/cm². The post-cleaning microbial load, however, demonstrated that regardless of material type, previous microbial load, or who carried out the cleaning procedure, all robots could be brought to well below acceptable levels. Five of the eight robots had undetectable levels of microbes after cleaning and 48 hours of incubation, and the remaining three robots had only 0.04 to 0.08 CFU/cm² after this protocol.

Hannah Bradwell, a researcher at the Centre for Health Technology, says:

“Robot pets may be beneficial for older adults and people with dementia living in care homes, likely improving wellbeing and providing company. This benefit could be particularly relevant at present, in light of social isolation, however our study has shown the strong requirement for considerations around infection control for these devices.”

Enabling Situational Awareness via Augmented Reality of Autonomous Robot-Based Environmental Change Detection

by Christopher Reardon, Jason Gregory, Carlos Nieto-Granda, John G. Rogers in 12th International Conference on Virtual, Augmented, and Mixed Reality

Something is different, and you can’t quite put your finger on it. But your robot can.

Even small changes in your surroundings could indicate danger. Imagine a robot could detect those changes, and a warning could immediately alert you through a display in your eyeglasses. That is what U.S. Army scientists are developing with sensors, robots, real-time change detection and augmented reality wearables.

Army researchers demonstrated in a real-world environment the first human-robot team in which the robot detects physical changes in 3D and shares that information with a human in real time through augmented reality; the human can then evaluate the information and decide on follow-on action.

“This could let robots inform their Soldier teammates of changes in the environment that might be overlooked by or not perceptible to the Soldier, giving them increased situational awareness and offset from potential adversaries,” said Dr. Christopher Reardon, a researcher at the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory. “This could detect anything from camouflaged enemy soldiers to IEDs.”

Part of the lab’s effort in contextual understanding through the Artificial Intelligence for Mobility and Maneuver Essential Research Program, this research explores how to provide contextual awareness to autonomous robotic ground platforms in maneuver and mobility scenarios. Researchers also participate with international coalition partners in the Technical Cooperation Program’s Contested Urban Environment Strategic Challenge, or TTCP CUESC, events to test and evaluate human-robot teaming technologies.

Most academic research in the use of mixed reality interfaces for human-robot teaming does not enter real-world environments, but rather uses external instrumentation in a lab to manage the calculations necessary to share information between a human and robot. Likewise, most engineering efforts to provide humans with mixed-reality interfaces do not examine teaming with autonomous mobile robots, Reardon said.

Reardon and his colleagues from the Army and the University of California, San Diego, published their research, Enabling Situational Awareness via Augmented Reality of Autonomous Robot-Based Environmental Change Detection, at the 12th International Conference on Virtual, Augmented, and Mixed Reality, part of the International Conference on Human-Computer Interaction.

The research paired a small autonomous mobile ground robot, equipped with laser ranging sensors, known as LIDAR, to build a representation of the environment, with a human teammate wearing augmented reality glasses. As the robot patrolled the environment, it compared its current and previous readings to detect changes in the environment. Those changes were then instantly displayed in the human’s eyewear to determine whether the human could interpret the changes in the environment.
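
A schematic of that comparison step, reducing the map to a boolean occupancy grid and flagging cells that flipped between patrols; this is a stand-in for the actual registered point-cloud pipeline, with made-up data:

```python
import numpy as np

def detect_changes(prev_grid: np.ndarray, curr_grid: np.ndarray) -> np.ndarray:
    """Return indices of occupancy cells that changed between two passes.
    A real system compares registered LIDAR point clouds; a boolean grid
    keeps the sketch simple."""
    return np.argwhere(prev_grid != curr_grid)

prev = np.zeros((4, 4), dtype=bool)      # map from the previous patrol
curr = prev.copy()
curr[1, 2] = True                        # something new appeared here
for cell in detect_changes(prev, curr):
    print(f"change at cell {tuple(cell)} -> highlight in the AR display")
```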

In studying communication between the robot and the human, the researchers tested LIDAR sensors of different resolutions on the robot to collect measurements of the environment and detect changes. When those changes were shared with the human using augmented reality, the researchers found that human teammates could interpret changes that even the lower-resolution LIDARs detected. This indicates that, depending on the size of the changes they expect to encounter, lighter, smaller and less expensive sensors could perform just as well, and run faster in the process.

This capability has the potential to be incorporated into future Soldier mixed-reality interfaces such as the Army’s Integrated Visual Augmentation System goggles, or IVAS.

“Incorporating mixed reality into Soldiers’ eye protection is inevitable,” Reardon said. “This research aims to fill gaps by incorporating useful information from robot teammates into the Soldier-worn visual augmentation ecosystem, while simultaneously making the robots better teammates to the Soldier.”

Future studies will continue to explore how to strengthen the teaming between humans and autonomous agents by allowing the human to interact with the detected changes, which will provide more information to the robot about the context of the change, for example, changes made by adversaries versus natural environmental changes or false positives, Reardon said. This will improve the autonomous context understanding and reasoning capabilities of the robotic platform, such as by enabling the robot to learn and predict what types of changes constitute a threat. In turn, providing this understanding to autonomy will help researchers learn how to improve teaming of Soldiers with autonomous platforms.

Videos

Researchers from the Wyss Institute, Harvard SEAS, and Sony have created the mini-RCM, a small surgical robot that can help surgeons perform delicate teleoperated procedures on the human body.

Tokyo startup Telexistence has recently unveiled a new robot called the Model-T, an advanced teleoperated humanoid that can use tools and grasp a wide range of objects. Japanese convenience store chain FamilyMart plans to test the Model-T to restock shelves in up to 20 stores by 2022. In the trial, a human “pilot” will operate the robot remotely, handling items like beverage bottles, rice balls, sandwiches, and bento boxes.

Quadruped dance-off should be a new robotics competition at IROS or ICRA.

Anthony Cowley wrote in to share his recent thesis work on UPSLAM, a fast and lightweight SLAM technique that records data in panoramic depth images (just PNGs) that are easy to visualize and even easier to share between robots, even on low-bandwidth networks.
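
The depth-as-PNG idea is easy to reproduce in miniature: quantize depth to 16-bit integers and write a grayscale PNG, which compresses losslessly and travels well over a slow link. A sketch with NumPy and Pillow, mirroring the general idea rather than UPSLAM's exact format:

```python
import numpy as np
from PIL import Image

# Synthetic panoramic depth image, in metres.
depth_m = np.random.uniform(0.5, 20.0, size=(64, 256)).astype(np.float32)

# Quantize metres to 16-bit millimetres: a compact, lossless PNG payload.
depth_mm = np.clip(depth_m * 1000.0, 0, 65535).astype(np.uint16)
Image.fromarray(depth_mm, mode="I;16").save("panorama_depth.png")

# Round trip: a peer robot reads the PNG and recovers metres.
restored_m = np.asarray(Image.open("panorama_depth.png"), dtype=np.float64) / 1000.0
print(f"max round-trip error: {np.abs(restored_m - depth_m).max() * 1000:.2f} mm")  # <= 1
```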

Through a hybrid of simulation and real-life training, this air muscle robot is learning to play table tennis.

The University of Michigan has a fancy new treadmill that’s built right into the floor, which proves to be a bit much for Mini Cheetah.

Upcoming events

ICUAS 2020 — September 1–4, 2020 — Athens, Greece
ICRES 2020 — September 28–29, 2020 — Taipei, Taiwan
AUVSI EXPONENTIAL 2020 — October 5–8, 2020 — [Online Conference]
IROS 2020 — October 25–29, 2020 — Las Vegas, Nev., USA
CYBATHLON 2020 — November 13–14, 2020 — [Online Event]
ICSR 2020 — November 14–16, 2020 — Golden, Colo., USA

MISC

Subscribe to Paradigm!

Medium. Twitter. Telegram. Reddit.

Main sources

Research articles

Science Robotics

Science Daily

IEEE Spectrum
