RT/ Jellyfish-inspired soft robots can outswim their natural counterparts

Paradigm
Jul 9, 2020 · 21 min read

Robotics biweekly vol. 8, 25th June — 9th July

TL;DR

  • Engineering researchers have developed soft robots inspired by jellyfish that can outswim their real-life counterparts. More practically, the new jellyfish-bots highlight a technique that uses pre-stressed polymers to make soft robots more powerful.
  • Researchers have developed a family of soft materials that imitates living creatures. When hit with light, the film-thin materials come alive — bending, rotating and even crawling on surfaces.
  • Scientists have discovered that a single, 60-minute interaction with PARO, a social robot modeled on a baby seal, improved mood and reduced mild or severe pain. When participants touched PARO, they experienced greater pain reduction than when it was simply present in their room.
  • Researchers propose STyLuS* (large-Scale optimal Temporal Logic Synthesis), a new approach to finding optimal solutions for controlling large numbers of robots that collaboratively complete sets of complex linear temporal logic commands. It can solve problems massively larger than current algorithms can handle (hundreds of robots, tens of thousands of rooms and highly complex tasks) in a small fraction of the time.
  • Engineers have found a mathematical means of helping regulators and businesses manage and police Artificial Intelligence systems’ biases towards making unethical, and potentially very costly and damaging, commercial choices — an ethical eye on AI.
  • Teaching physics to neural networks enables those networks to better adapt to chaos within their environment. The work has implications for improved artificial intelligence (AI) applications ranging from medical diagnostics to automated drone piloting.
  • CSAIL robot disinfects Greater Boston Food Bank. Using UV-C light, the system can disinfect a warehouse floor in half an hour — and could one day be employed in grocery stores, schools, and other spaces.
  • A new video presents a mission of Team CERBERUS’s Alpha Aerial Scout during the Urban Circuit event of the DARPA Subterranean Challenge: the robot autonomously explores the Satsop Abandoned Power Plant during the team’s third field trial.
  • More excellent talks from the remote Legged Robots ICRA workshop.
  • The June issue of Science Robotics.
  • Check out upcoming robotics events (mostly virtual) below. And more!

Robotics market

The global market for robots is predicted to hit the 100 billion U.S. dollar mark in 2020 and is expected to grow at a compound annual growth rate (CAGR) of around 26 percent, reaching just under 210 billion U.S. dollars by 2025.

Size of the global market for industrial and non-industrial robots between 2018 and 2025 (in billion U.S. dollars). Source: Statista

Research articles

Leveraging Monostable and Bistable Pre‐Curved Bilayer Actuators for High‐Performance Multitask Soft Robots

by Yinding Chi, Yichao Tang, Haijun Liu, Jie Yin in Advanced Materials Technologies

Engineering researchers at North Carolina State University and Temple University have developed soft robots inspired by jellyfish that can outswim their real-life counterparts. More practically, the new jellyfish-bots highlight a technique that uses pre-stressed polymers to make soft robots more powerful.

“Our previous work focused on making soft robots that were inspired by cheetahs — and while the robots were very fast, they still had a stiff inner spine,” says Jie Yin, an assistant professor of mechanical and aerospace engineering at NC State and corresponding author of a paper on the new work. “We wanted to make a completely soft robot, without an inner spine, that still utilized that concept of switching between two stable states in order to make the soft robot move more powerfully — and more quickly. And one of the animals we were inspired by was the jellyfish.”

The researchers created their new soft robots from two bonded layers of the same elastic polymer. One layer of polymer was pre-stressed, or stretched. A second layer was not pre-stressed and contained an air channel.

“We can make the robot ‘flex’ by pumping air into the channel layer, and we control the direction of that flex by controlling the relative thickness of the pre-stressed layer,” Yin says.

Here’s how it works. When combined with a third, stress-free layer, called an intermediate layer, the pre-stressed layer wants to move in a particular direction. For example, you might have a polymer strip that has been pre-stressed by stretching it in two directions. After attaching the pre-stressed material to the intermediate layer, the end result is a bilayer strip that wants to curve down, like a frowning face. If this bilayer strip, also called the pre-stressed layer, is thinner than the layer with the air channel, that frowning curve will bend into a smiling curve as air is pumped into the channel layer. However, if the pre-stressed layer is thicker than the channel layer, the frown will become more and more pronounced as air is pumped in. Either way, once the air is allowed to leave the channel layer, the material snaps back to its original, “resting” state.
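To make the thickness rule concrete, here is a minimal, purely qualitative sketch in Python. The constants, the additive combination of pre-stress and pneumatic bending, and the function itself are illustrative assumptions, not the constitutive model from the paper.

```python
# Qualitative toy model of the bilayer actuator (illustrative assumptions
# throughout; not the model from the paper).

def bilayer_curvature(pressure, t_prestressed, t_channel,
                      k_rest=-1.0, coupling=0.5):
    """Net curvature of the bonded bilayer: negative = 'frown', positive = 'smile'.

    pressure      -- air pressure in the channel layer (arbitrary units)
    t_prestressed -- thickness of the pre-stressed bilayer strip
    t_channel     -- thickness of the channel layer
    k_rest        -- resting curvature set by the pre-stress (assumed)
    coupling      -- pneumatic bending per unit pressure (assumed)
    """
    # Inflation bends the strip against its resting frown when the
    # pre-stressed strip is the thinner layer, and deepens the frown
    # when it is the thicker one.
    direction = 1.0 if t_prestressed < t_channel else -1.0
    return k_rest + direction * coupling * pressure

# Thin pre-stressed strip: pumping air flips the frown into a smile.
print(bilayer_curvature(pressure=4.0, t_prestressed=1.0, t_channel=2.0))  #  1.0
# Thick pre-stressed strip: pumping air makes the frown more pronounced.
print(bilayer_curvature(pressure=4.0, t_prestressed=2.0, t_channel=1.0))  # -3.0
```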

In fact, this simple example describes one of the soft robots created by the research team, a fast-moving soft crawler. It resembles a larval insect curling its body, then jumping forward as it quickly releases its stored energy.

The jellyfish-bot is slightly more complicated, with the pre-stressed disk-like layer being stretched in four directions (think of it as being pulled east and west simultaneously, then being pulled north and south simultaneously). The channel layer is also different, consisting of a ring-like air channel. The end result is a dome that looks like a jellyfish.

As the jellyfish-bot “relaxes,” the dome curves up, like a shallow bowl. When air is pumped into the channel layer, the dome quickly curves down, pushing out water and propelling itself forward. In experimental testing, the jellyfish-bot had an average speed of 53.3 millimeters per second. That’s not bad, considering that none of the three jellyfish species the researchers examined went faster than an average of 30 millimeters per second.

Lastly, the researchers created a three-pronged gripping robot — with a twist. Most grippers hang open when “relaxed,” and require energy to hold on to their cargo as it is lifted and moved from point A to point B. But Yin and his collaborators used the pre-stressed layers to create grippers whose default position is clenched shut. Energy is required to open the grippers, but once they’re in position, the grippers return to their “resting” mode — holding their cargo tight.

“The advantage here is that you don’t need energy to hold on to the object during transport — it’s more efficient,” Yin says.

Supramolecular–covalent hybrid polymers for light-activated mechanical actuation

by Chuang Li, Aysenur Iscen, Hiroaki Sai, Kohei Sato, Nicholas A. Sather, Stacey M. Chin, Zaida Álvarez, Liam C. Palmer, George C. Schatz, Samuel I. Stupp in Nature Materials

Researchers have developed a family of soft materials that imitates living creatures. When hit with light, the film-thin materials come alive — bending, rotating and even crawling on surfaces.

Called “robotic soft matter” by the Northwestern team, the materials move without complex hardware, hydraulics or electricity. The researchers believe the lifelike materials could carry out many tasks, with potential applications in energy, environmental remediation and advanced medicine.

“We live in an era in which increasingly smarter devices are constantly being developed to help us manage our everyday lives,” said Northwestern’s Samuel I. Stupp, who led the experimental studies. “The next frontier is in the development of new science that will bring inert materials to life for our benefit — by designing them to acquire capabilities of living creatures.”

Stupp is the Board of Trustees Professor of Materials Science and Engineering, Chemistry, Medicine and Biomedical Engineering at Northwestern and director of the Simpson Querrey Institute. He has appointments in the McCormick School of Engineering, Weinberg College of Arts and Sciences and Feinberg School of Medicine. George Schatz, the Charles E. and Emma H. Morrison Professor of Chemistry in Weinberg, led computer simulations of the materials’ lifelike behaviors. Postdoctoral fellow Chuang Li and graduate student Aysenur Iscen, from the Stupp and Schatz laboratories, respectively, are co-first authors of the paper.

Although the moving material seems miraculous, sophisticated science is at play. Its structure comprises nanoscale peptide assemblies that drain water molecules out of the material. An expert in materials chemistry, Stupp linked the peptide arrays to polymer networks designed to be chemically responsive to blue light.

When light hits the material, the network chemically shifts from hydrophilic (attracts water) to hydrophobic (resists water). As the material expels the water through its peptide “pipes,” it contracts — and comes to life. When the light is turned off, water re-enters the material, which expands as it reverts to a hydrophilic structure.

This is reminiscent of the reversible contraction of muscles, which inspired Stupp and his team to design the new materials.

“From biological systems, we learned that the magic of muscles is based on the connection between assemblies of small proteins and giant protein polymers that expand and contract,” Stupp said. “Muscles do this using a chemical fuel rather than light to generate mechanical energy.”

For Northwestern’s bio-inspired material, localized light can trigger directional motion. In other words, bending can occur in different directions, depending on where the light is located. And changing the direction of the light also can force the object to turn as it crawls on a surface.

Stupp and his team believe there are endless possible applications for this new family of materials. With the ability to be designed in different shapes, the materials could play a role in a variety of tasks, ranging from environmental clean-up to brain surgery.

“These materials could augment the function of soft robots needed to pick up fragile objects and then release them in a precise location,” he said. “In medicine, for example, soft materials with ‘living’ characteristics could bend or change shape to retrieve blood clots in the brain after a stroke. They also could swim to clean water supplies and sea water or even undertake healing tasks to repair defects in batteries, membranes and chemical reactors.”

Touching the social robot PARO reduces pain perception and salivary oxytocin levels

by Nirit Geva, Florina Uzefovsky, Shelly Levy-Tzedek in Scientific Reports

Researchers have discovered that a single, 60-minute interaction with the social robot PARO improved mood and reduced mild or severe pain. When participants touched PARO, they experienced greater pain reduction than when it was simply present in their room.

Human-human social touch improves mood and alleviates pain. No studies have so far tested the effect of human-robot emotional touch on experimentally induced pain ratings, on mood and on oxytocin levels in healthy young adults. Here, we assessed the effect of touching the robot PARO on pain perception, on mood and on salivary oxytocin levels, in 83 young adults. We measured their perceived pain, happiness state, and salivary oxytocin. For the 63 participants in the PARO group, pain was assessed in three conditions: Baseline, Touch (touching PARO) and No-Touch (PARO present). The control group (20 participants) underwent the same measurements without ever encountering PARO. There was a decrease in pain ratings and in oxytocin levels and an increase in happiness ratings compared to baseline only in the PARO group. The Touch condition yielded a larger decrease in pain ratings compared to No-Touch. These effects correlated with the participants’ positive perceptions of the interaction with PARO. Participants with higher perceived ability to communicate with PARO experienced a greater hypoalgesic effect when touching PARO. We show that human-robot social touch is effective in reducing pain ratings, improving mood and — surprisingly — reducing salivary oxytocin levels in adults.

An illustration of the experimental setup. The participant (on the left) has the heat stimulator placed on her non-dominant arm, which is placed on the table. The experimenter (on the right) administers the accurate heat stimuli, and tracks them on the screen. (A) Baseline condition; PARO is not present. (B) No-Touch condition; PARO is present in the room, without physical contact with the participant. (C) Touch condition; PARO is placed on the table next to the participant, who touches it during the administration of the heat stimuli. In the control group, PARO was not present during the entire experimental session, as in (A).

STyLuS*: A Temporal Logic Optimal Control Synthesis Algorithm for Large-Scale Multi-Robot Systems

by Yiannis Kantaros, Michael M Zavlanos in The International Journal of Robotics Research

Researchers propose STyLuS* (large-Scale optimal Temporal Logic Synthesis), a new approach to finding optimal solutions for controlling large numbers of robots that collaboratively complete sets of complex linear temporal logic commands. It can solve problems massively larger than current algorithms can handle (hundreds of robots, tens of thousands of rooms and highly complex tasks) in a small fraction of the time.

This article proposes a new highly scalable and asymptotically optimal control synthesis algorithm from linear temporal logic specifications, called STyLuS* for large-Scale optimal Temporal Logic Synthesis, that is designed to solve complex temporal planning problems in large-scale multi-robot systems. Existing planning approaches with temporal logic specifications rely on graph search techniques applied to a product automaton constructed among the robots. In previous work, the researchers proposed a more tractable sampling-based algorithm that incrementally builds trees approximating the state space and transitions of the synchronous product automaton, and therefore does not require sophisticated graph search techniques. Here, they extend that work by introducing bias in the sampling process, guided by transitions in the Büchi automaton that belong to the shortest path to the accepting states. This allows them to synthesize optimal motion plans from product automata with hundreds of orders of magnitude more states than existing optimal control synthesis methods or off-the-shelf model checkers can manipulate. The authors show that STyLuS* is probabilistically complete and asymptotically optimal and has an exponential convergence rate; this is the first time that convergence-rate results have been provided for sampling-based optimal control synthesis methods. Simulation results show that STyLuS* can synthesize optimal motion plans for very large multi-robot systems, which is impossible using state-of-the-art methods.
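As rough intuition for the biased sampling idea, here is a toy-scale sketch: one robot on a six-room corridor, a two-state Büchi automaton for the task “eventually reach room 5”, and a tree grown over product states with node selection biased toward states closest to enabling the next transition toward acceptance. Every data structure and the bias rule are illustrative stand-ins, not the authors’ implementation.

```python
import random

# Toy-scale sketch of biased, tree-based sampling over a product automaton
# (illustrative stand-ins throughout; not the authors' implementation).

ROOMS = range(6)   # one robot on a 6-room corridor
ACCEPTING = {1}    # Buchi state 1 is accepting

def buchi_step(q, room):
    """2-state Buchi automaton for 'eventually reach room 5'."""
    return 1 if (q == 0 and room == 5) else q

def dist_to_accept(q):
    # Shortest Buchi-graph distance to an accepting state; precomputed by
    # BFS in general, trivial for this 2-state automaton.
    return 0 if q in ACCEPTING else 1

def biased_tree_search(start_room=0, bias=0.9, max_iters=10_000, seed=0):
    """Grow a tree over product states (room, buchi_state), biasing node
    selection toward nodes closest to the accepting Buchi transition."""
    rng = random.Random(seed)
    root = (start_room, 0)
    parent = {root: None}
    nodes = [root]
    for _ in range(max_iters):
        if rng.random() < bias:
            # Bias: prefer nodes closest (in Buchi distance, then in rooms)
            # to enabling the next transition toward acceptance, i.e. room 5.
            node = min(nodes, key=lambda s: (dist_to_accept(s[1]), abs(5 - s[0])))
        else:
            node = rng.choice(nodes)  # occasional uniform pick keeps exploring
        room, q = node
        next_room = min(max(room + rng.choice((-1, 1)), 0), 5)
        child = (next_room, buchi_step(q, next_room))
        if child not in parent:
            parent[child] = node
            nodes.append(child)
            if child[1] in ACCEPTING:  # accepting product state reached
                path = [child]
                while parent[path[-1]] is not None:
                    path.append(parent[path[-1]])
                return list(reversed(path))
    return None

print(biased_tree_search())  # e.g. [(0, 0), (1, 0), ..., (4, 0), (5, 1)]
```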

An unethical optimization principle

by Nicholas Beale, Heather Battey, Anthony C. Davison, Robert S. MacKay in Royal Society Open Science

Researchers from the University of Warwick, Imperial College London, EPFL (Lausanne) and Sciteb Ltd have found a mathematical means of helping regulators and businesses manage and police Artificial Intelligence systems’ biases towards making unethical, and potentially very costly and damaging, commercial choices — an ethical eye on AI.

Artificial intelligence (AI) is increasingly deployed in commercial situations. Consider for example using AI to set prices of insurance products to be sold to a particular customer. There are legitimate reasons for setting different prices for different people, but it may also be profitable to ‘game’ their psychology or willingness to shop around.

The AI has a vast number of potential strategies to choose from, but some are unethical and will incur not just a moral cost but a significant potential economic penalty: if stakeholders find that such a strategy has been used, regulators may levy significant fines of billions of dollars, pounds or euros, customers may boycott the company, or both.

In an environment in which decisions are increasingly made without human intervention, there is therefore a very strong incentive to know under what circumstances AI systems might adopt an unethical strategy, and to reduce that risk or eliminate it entirely if possible.

Mathematicians and statisticians from the University of Warwick, Imperial, EPFL and Sciteb Ltd have come together to help businesses and regulators by creating a new “Unethical Optimization Principle” and providing a simple formula to estimate its impact. They have laid out the full details in a paper titled “An unethical optimization principle”.

The four authors of the paper are Nicholas Beale of Sciteb Ltd; Heather Battey of the Department of Mathematics, Imperial College London; Anthony C. Davison of the Institute of Mathematics, Ecole Polytechnique Fédérale de Lausanne; and Professor Robert MacKay of the Mathematics Institute of the University of Warwick.

Professor MacKay said:

“Our suggested ‘Unethical Optimization Principle’ can be used to help regulators, compliance staff and others to find problematic strategies that might be hidden in a large strategy space. Optimisation can be expected to choose disproportionately many unethical strategies, inspection of which should show where problems are likely to arise and thus suggest how the AI search algorithm should be modified to avoid them in future.

“The Principle also suggests that it may be necessary to re-think the way AI operates in very large strategy spaces, so that unethical outcomes are explicitly rejected in the optimization/learning process.”
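A toy Monte Carlo makes the “disproportionately many” point vivid. Suppose only 1% of strategies are unethical but they carry a modest extra expected return; all the numbers below are invented for illustration, and the paper’s actual contribution is a formal estimate of this effect, not a simulation.

```python
import random

# Toy illustration of the Unethical Optimization Principle: all parameters
# are invented; the paper provides a formal estimate, not this simulation.

def chance_optimum_is_unethical(n_strategies=10_000, frac_unethical=0.01,
                                edge=1.5, trials=200, seed=1):
    """Fraction of runs in which naive return-maximization over the whole
    strategy space lands on an unethical strategy."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        best_return, best_is_unethical = float("-inf"), False
        for _ in range(n_strategies):
            unethical = rng.random() < frac_unethical
            # Unethical strategies enjoy a small mean advantage ('edge').
            ret = rng.gauss(edge if unethical else 0.0, 1.0)
            if ret > best_return:
                best_return, best_is_unethical = ret, unethical
        hits += best_is_unethical
    return hits / trials

# Although only 1% of strategies are unethical, the optimizer picks one
# far more often than 1% of the time.
print(chance_optimum_is_unethical())
```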

Physics-enhanced neural networks learn order and chaos

by Anshul Choudhary, John F. Lindner, Elliott G. Holliday, Scott T. Miller, Sudeshna Sinha, William L. Ditto in Physical Review E

Researchers from North Carolina State University have discovered that teaching physics to neural networks enables those networks to better adapt to chaos within their environment. The work has implications for improved artificial intelligence (AI) applications ranging from medical diagnostics to automated drone piloting.

Neural networks are an advanced type of AI loosely based on the way that our brains work. Our natural neurons exchange electrical impulses according to the strengths of their connections. Artificial neural networks mimic this behavior by adjusting numerical weights and biases during training sessions to minimize the difference between their actual and desired outputs. For example, a neural network can be trained to identify photos of dogs by sifting through a large number of photos, making a guess about whether the photo is of a dog, seeing how far off it is and then adjusting its weights and biases until they are closer to reality.
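In code, that loop of guessing, measuring the error and nudging the weights looks like the minimal single-neuron example below. It is a deliberately tiny stand-in: real image classifiers have millions of weights and learn from real pixels, not two made-up features.

```python
import math
import random

# Minimal single-neuron version of the training loop described above.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

random.seed(0)
# Fake two-feature "photos": dogs (label 1) cluster near (1, 1),
# non-dogs (label 0) near (0, 0).
data = [([random.gauss(label, 0.3), random.gauss(label, 0.3)], label)
        for label in (0, 1) for _ in range(50)]

w, b, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(200):                        # training passes over the data
    for (x1, x2), label in data:
        guess = sigmoid(w[0] * x1 + w[1] * x2 + b)
        error = guess - label               # how far off the guess was
        w[0] -= lr * error * x1             # nudge weights and bias so the
        w[1] -= lr * error * x2             # next guess lands closer
        b -= lr * error

print(round(sigmoid(w[0] + w[1] + b), 2))   # input (1, 1): confidently "dog"
```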

The drawback to this neural network training is something called “chaos blindness” — an inability to predict or respond to chaos in a system. Conventional AI is chaos blind. But researchers from NC State’s Nonlinear Artificial Intelligence Laboratory (NAIL) have found that incorporating a Hamiltonian function into neural networks better enables them to “see” chaos within a system and adapt accordingly.

Simply put, the Hamiltonian embodies the complete information about a dynamic physical system — the total amount of all the energies present, kinetic and potential. Picture a swinging pendulum, moving back and forth in space over time. Now look at a snapshot of that pendulum. The snapshot cannot tell you where that pendulum is in its arc or where it is going next. Conventional neural networks operate from a snapshot of the pendulum. Neural networks familiar with the Hamiltonian flow understand the entirety of the pendulum’s movement — where it is, where it will or could be, and the energies involved in its movement.

In a proof-of-concept project, the NAIL team incorporated Hamiltonian structure into neural networks, then applied them to a known model of stellar and molecular dynamics called the Hénon-Heiles model. The Hamiltonian neural network accurately predicted the dynamics of the system, even as it moved between order and chaos.
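For the curious, the Hénon-Heiles system has a compact, standard Hamiltonian, and integrating Hamilton’s equations shows why the Hamiltonian carries the full dynamics. The sketch below uses the textbook analytic form; a Hamiltonian neural network would instead learn H from trajectory data and recover the same vector field by automatic differentiation. This is an illustration, not the paper’s code.

```python
import numpy as np

# Henon-Heiles Hamiltonian (standard textbook form). An HNN would replace
# this analytic H with a learned network and obtain dH/dq, dH/dp by autodiff.

def hamiltonian(q, p):
    x, y = q
    return 0.5 * (p @ p) + 0.5 * (q @ q) + x**2 * y - y**3 / 3.0

def grad_H(q, p):
    x, y = q
    dH_dq = np.array([x + 2 * x * y, y + x**2 - y**2])
    return dH_dq, p.copy()

def step_rk4(q, p, dt):
    """One RK4 step of Hamilton's equations dq/dt = dH/dp, dp/dt = -dH/dq."""
    def f(state):
        q_, p_ = state
        dH_dq, dH_dp = grad_H(q_, p_)
        return np.array([dH_dp, -dH_dq])
    s = np.array([q, p])
    k1 = f(s)
    k2 = f(s + 0.5 * dt * k1)
    k3 = f(s + 0.5 * dt * k2)
    k4 = f(s + dt * k3)
    s = s + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    return s[0], s[1]

q, p = np.array([0.1, 0.0]), np.array([0.0, 0.35])
E0 = hamiltonian(q, p)
for _ in range(10_000):
    q, p = step_rk4(q, p, dt=0.01)
print(abs(hamiltonian(q, p) - E0))  # energy drift stays tiny: dynamics respect H
```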

“The Hamiltonian is really the ‘special sauce’ that gives neural networks the ability to learn order and chaos,” says John Lindner, visiting researcher at NAIL, professor of physics at The College of Wooster and corresponding author of a paper describing the work. “With the Hamiltonian, the neural network understands underlying dynamics in a way that a conventional network cannot. This is a first step toward physics-savvy neural networks that could help us solve hard problems.”

The work appears in Physical Review E and is supported in part by the Office of Naval Research (grant N00014-16-1-3066). NC State postdoctoral researcher Anshul Choudhary is first author. Bill Ditto, professor of physics at NC State, is director of NAIL. Visiting researcher Scott Miller; Sudeshna Sinha, from the Indian Institute of Science Education and Research Mohali; and NC State graduate student Elliott Holliday also contributed to the work.

CSAIL robot disinfects Greater Boston Food Bank

Using UV-C light, the system can disinfect a warehouse floor in half an hour — and could one day be employed in grocery stores, schools, and other spaces.

With every droplet that we can’t see, touch, or feel dispersed into the air, the threat of spreading Covid-19 persists. It’s become increasingly critical to keep these heavy droplets from lingering — especially on surfaces, which are welcoming and generous hosts.

Thankfully, our chemical cleaning products are effective, but using them to disinfect larger settings can be expensive, dangerous, and time-consuming. Across the globe there are thousands of warehouses, grocery stores, schools, and other spaces where cleaning workers are at risk.

With that in mind, a team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), in collaboration with Ava Robotics and the Greater Boston Food Bank (GBFB), designed a new robotic system that powerfully disinfects surfaces and neutralizes aerosolized forms of the coronavirus.

The approach uses a custom UV-C light fixture designed at CSAIL that is integrated with Ava Robotics’ mobile robot base. The results were encouraging enough that researchers say that the approach could be useful for autonomous UV disinfection in other environments, such as factories, restaurants, and supermarkets.

UV-C light has proven to be effective at killing viruses and bacteria on surfaces and in aerosols, but it’s unsafe for humans to be exposed to it. Fortunately, Ava’s telepresence robot doesn’t require any human supervision. Instead of the telepresence top, the team subbed in a UV-C array for disinfecting surfaces. Specifically, the array uses short-wavelength ultraviolet light to kill microorganisms and disrupt their DNA in a process called ultraviolet germicidal irradiation.

The complete robot system is capable of mapping the space — in this case, GBFB’s warehouse — and navigating between waypoints and other specified areas. In testing the system, the team used a UV-C dosimeter, which confirmed that the robot was delivering the expected dosage of UV-C light predicted by the model.

“Food banks provide an essential service to our communities, so it is critical to help keep these operations running,” says Alyssa Pierson, CSAIL research scientist and technical lead of the UV-C lamp assembly. “Here, there was a unique opportunity to provide additional disinfecting power to their current workflow, and help reduce the risks of Covid-19 exposure.”

Food banks are also facing a particular demand due to the stress of Covid-19. The United Nations projected that, because of the virus, the number of people facing severe food insecurity worldwide could double to 265 million. In the United States alone, the five-week total of job losses has risen to 26 million, potentially pushing millions more into food insecurity.

During tests at GBFB, the robot was able to drive by the pallets and storage aisles at a speed of roughly 0.22 miles per hour. At this speed, the robot could cover a 4,000-square-foot space in GBFB’s warehouse in just half an hour. The UV-C dosage delivered during this time can neutralize approximately 90 percent of coronaviruses on surfaces. For many surfaces, this dose will be higher, resulting in more of the virus neutralized.
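Those figures are self-consistent under one inferred assumption: the effective width of the disinfected strip. Here is a quick back-of-the-envelope check (the swath width is our inference, not a number from CSAIL):

```python
# Back-of-the-envelope check of the reported coverage figures (the implied
# swath width is an inference, not a number from CSAIL).

MPH_TO_MPS = 1609.344 / 3600          # miles per hour -> meters per second
SQFT_TO_SQM = 0.09290304              # square feet -> square meters

speed_mps = 0.22 * MPH_TO_MPS         # ~0.098 m/s
path_m = speed_mps * 30 * 60          # distance driven in half an hour: ~177 m
area_sqm = 4000 * SQFT_TO_SQM         # ~372 m^2

print(f"implied UV-C swath width: {area_sqm / path_m:.2f} m")  # ~2.10 m
```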

This method of ultraviolet germicidal irradiation is used largely in hospitals and medical settings to sterilize patient rooms and stop the spread of microorganisms like methicillin-resistant Staphylococcus aureus and Clostridium difficile, and the UV-C light also works against airborne pathogens. While it’s most effective in the direct “line of sight,” it can reach nooks and crannies as the light bounces off surfaces and onto other surfaces.

“Our 10-year-old warehouse is a relatively new food distribution facility with AIB-certified, state-of-the-art cleanliness and food safety standards,” says Catherine D’Amato, president and CEO of the Greater Boston Food Bank. “Covid-19 is a new pathogen that GBFB, and the rest of the world, was not designed to handle. We are pleased to have this opportunity to work with MIT CSAIL and Ava Robotics to innovate and advance our sanitation techniques to defeat this menace.”

As a first step, the team teleoperated the robot to teach it the path around the warehouse. Once that path is learned, the robot navigates autonomously, without the team needing to steer it remotely.

It can go to defined waypoints on its map, such as the loading dock, then the warehouse shipping floor, then back to base. Those waypoints are defined by an expert human user in teleop mode, and new waypoints can be added to the map as needed.

Within GBFB, the team identified the warehouse shipping floor as a “high-importance area” for the robot to disinfect. Each day, workers stage aisles of products and arrange them for up to 50 pickups by partners and distribution trucks the next day. By focusing on the shipping area, the robot prioritizes disinfecting items leaving the warehouse, reducing the spread of Covid-19 into the community.

Currently, the team is exploring how to use its onboard sensors to adapt to changes in the environment, such that in new territory, the robot would adjust its speed to ensure the recommended dosage is applied to new objects and surfaces.

A unique challenge is that the shipping area is constantly changing, so each night, the robot encounters a slightly new environment. When the robot is deployed, it doesn’t necessarily know which of the staging aisles will be occupied, or how full each aisle might be. Therefore, the team notes that they need to teach the robot to differentiate between the occupied and unoccupied aisles, so it can change its planned path accordingly.

As far as production went, “in-house manufacturing” took on a whole new meaning for this prototype and the team. The UV-C lamps were assembled in Pierson’s basement, and CSAIL PhD student Jonathan Romanishin crafted a makeshift shop in his apartment for the electronics board assembly.

“As we drive the robot around the food bank, we are also researching new control policies that will allow the robot to adapt to changes in the environment and ensure all areas receive the proper estimated dosage,” says Pierson. “We are focused on remote operation to minimize human supervision, and, therefore, the additional risk of spreading Covid-19, while running our system.”

Team develops remote specimen collection robot

The Korea Institute of Machinery & Materials (KIMM) under the Ministry of Science and ICT developed a remote specimen collection robot that eliminates direct contact between medical personnel and patients.

The team led by Dr. Joonho Seo of the Department of Medical Assistant Robot, Daegu Research Center for Medical Devices and Green Energy at KIMM collaborated with a team under Professor Nam-Hee Kim of Dongguk University College of Medicine to develop a remote robot system that collects samples from the upper airway of persons presenting symptoms.

The robot system is composed of a remote device controlled by medical personnel and a robot that interacts with patients. The robot is equipped with a disposable swab, which retrieves samples from the nose and mouth of a patient, and moves according to the operation of the master device.

The team applied remote control technology of parallel robots for remote sampling. The sampling swab of the robot moves or rotates according to the operation of the master device, and retrieves samples when inserted into the mouth or nose. The system also supports video and audio communication between patients and physicians.

Medical personnel can directly operate the robot while viewing the position of the swab in a patient’s nose or mouth on the camera. The force of swab insertion can be remotely monitored, which further enhances sampling accuracy and safety.
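As a schematic of that master-slave loop, here is a minimal sketch. Every interface in it (the pose type, the motion scaling, the force threshold) is a hypothetical stand-in for illustration; KIMM’s actual controller for the parallel robot is not described in this digest.

```python
from dataclasses import dataclass
from typing import Optional

# Schematic master-slave teleoperation step (hypothetical stand-ins
# throughout; KIMM's actual parallel-robot controller is not public here).

@dataclass
class Pose:
    x: float
    y: float
    z: float
    roll: float  # swab rotation

MOTION_SCALE = 0.5   # assumed: scale the operator's hand motion down for safety
FORCE_LIMIT_N = 2.0  # assumed: refuse to advance above this axial force

def teleop_step(master: Pose, insertion_force_n: float) -> Optional[Pose]:
    """Map one master-device pose sample to a swab pose command, holding
    position (returning None) if the monitored insertion force is too high."""
    if insertion_force_n > FORCE_LIMIT_N:
        return None  # hold still; the operator sees the force reading spike
    return Pose(master.x * MOTION_SCALE,
                master.y * MOTION_SCALE,
                master.z * MOTION_SCALE,
                master.roll)  # rotation passed through one-to-one

print(teleop_step(Pose(0.02, 0.00, 0.10, 0.8), insertion_force_n=0.5))
```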

Slave robot of the non-face-to-face sampling system developed by Dr. Joonho Seo of KIMM. Credit: Korea Institute of Machinery & Materials (KIMM)

The system is expected to significantly reduce the risk of infection with COVID-19 and other high-risk diseases among medical personnel. The robot is affordable and roughly the size of a patient’s face, making it a feasible solution for various medical settings.

Dr. Joonho Seo of KIMM said, “This technology allows samples to be retrieved from persons presenting symptoms of high-risk diseases even without direct contact. I expect it to be useful in the screening of high-risk diseases like COVID-19, and hope it will contribute to the safety and well-being of medical personnel during pandemics and epidemics.”

The KIMM research team led by Dr. Joonho Seo is performing a demonstration of the non-face-to-face remote sampling robot. When the operator moves the master device, the slave robot at the remote site retrieves a sample according to the operator’s movement. Credit: Korea Institute of Machinery & Materials (KIMM)

Professor Nam-Hee Kim of Dongguk University added: “The robot not only lowers the risk of infection among medical personnel, but also removes the need to wear cumbersome protective gear. I believe it will have diverse clinical applications, especially in the diagnosis of infectious diseases.”

The June issue of Science Robotics is out!

Presenting a mini “lab” suspended on cables that moves in tandem with its flying specimen; platelet microrobots that can self-propel to target pathogens; and more:

  • Repurposing factories with robotics in the face of COVID-19: Can collaborative robots ramp up the production of medical ventilators?
  • Immune evasion by designer microrobots: Recent work is unveiling the interactions between magnetic microswimmers and cells of the immune system.
  • Drones against vector-borne diseases: Uncrewed aerial vehicles can reduce the cost of preventative measures against vector-borne diseases.
  • Transforming platelets into microrobots: Biocompatible cell robots powered by urea improve drug delivery through active movement.
  • Elucidating the interaction dynamics between microswimmer body and immune system for medical microrobots: Morphology-dependent immunogenicity obliges a compromise on the locomotion-focused design of medical microrobots.
  • Field performance of sterile male mosquitoes released from an uncrewed aerial vehicle: An automatic adult mosquito release device operated from a drone released sterile males without reducing their quality.

Videos

While it’s hard enough to get quadrupedal robots to walk in complex environments, this work from the Robotic Systems Lab at ETH Zurich shows some impressive whole body planning that allows ANYmal to squeeze its body through small or weirdly shaped spaces.

As control skills increase, we are more and more impressed by what a Cassie bipedal robot can do. Later this year, you will see this controller integrated with our real-time planner and perception system.

GITAI’s S1 arm is a little less exciting than their humanoid torso, but it looks like this one might actually be going to the ISS next year.

This video presents a mission of Team CERBERUS’s Alpha Aerial Scout during the Urban Circuit event of the DARPA Subterranean Challenge: the robot autonomously explores the Satsop Abandoned Power Plant during the team’s third field trial.

More excellent talks from the remote Legged Robots ICRA workshop.

Upcoming events

CLAWAR 2020 — August 24–26, 2020 — [Virtual Conference]

ICUAS 2020 — September 1–4, 2020 — Athens, Greece

ICRES 2020 — September 28–29, 2020 — Taipei, Taiwan

IROS 2020 — October 25–29, 2020 — Las Vegas, Nevada

ICSR 2020 — November 14–16, 2020 — Golden, Colorado

MISC

Subscribe to Paradigm!

Medium. Twitter. Telegram. Reddit.

Main sources

Research articles

Science Robotics

Science Daily

IEEE Spectrum
