RT/ Scientists design a two-legged robot powered by muscle tissue

Paradigm
Published in Paradigm · Feb 9, 2024 · 24 min read

Robotics & AI biweekly vol.89, 19th January — 9th February

TL;DR

  • Compared to robots, human bodies are flexible, capable of fine movements, and can convert energy efficiently into movement. Drawing inspiration from human gait, researchers from Japan crafted a two-legged biohybrid robot by combining muscle tissues and artificial materials. This method allows the robot to walk and pivot.
  • Researchers have developed a robotic sensor that incorporates artificial intelligence techniques to read braille at speeds roughly double that of most human readers.
  • Chemists have developed an autonomous chemical synthesis robot with an integrated AI-driven machine learning unit. Dubbed ‘RoboChem’, the benchtop device can outperform a human chemist in terms of speed and accuracy while also displaying a high level of ingenuity. As the first of its kind, it could significantly accelerate chemical discovery of molecules for pharmaceutical and many other applications.
  • Two insect-like robots, a mini-bug and a water strider, developed by researchers at Washington State University, are the smallest, lightest, and fastest fully functional micro-robots known to have been created.
  • Scientists developed a soft fluidic switch using an ionic polymer artificial muscle that runs on ultra-low power and can lift objects more than 34 times its own weight. Its light weight and small size let it control fluid flow with high precision, even in narrow spaces, making it applicable to industrial fields such as soft electronics, smart textiles, and biomedical devices.
  • An innovative study explores the use of robotic-assisted joint replacement in revision knee scenarios, comparing the pre- and post-revision implant positions in a series of revision total knee arthroplasties (TKA) using a state-of-the-art robotic arm system.
  • Researchers have published a strategy for identifying new targets for immunotherapy through artificial intelligence.
  • A group of stroke survivors in British Columbia is testing a new technology designed to aid their recovery and ultimately restore use of their limbs and hands. Participants wear a groundbreaking new ‘smart glove’ capable of tracking their hand and finger movements during rehabilitation exercises.
  • Researchers have developed a bio-logger for seabirds that enables long-term observation of rare behaviors. The bio-logger employs low-power depth sensors and accelerometers to identify rare behavior using a light-weight outlier detection model and records the behavior in a 5-min video. Observations using the bio-loggers on Streaked Shearwaters revealed novel aspects of head-shaking and foraging strategies. This approach will enable a wider range of animal behaviors in various environments to be observed.
  • Artificial intelligence using neural networks performs calculations digitally with the help of microelectronic chips. Physicists have now created a type of neural network that works not with electricity but with so-called active colloidal particles. The researchers describe how these microparticles can be used as a physical system for artificial intelligence and the prediction of time series.
  • And more!

Robotics market

The global market for robots is expected to grow at a compound annual growth rate (CAGR) of around 26 percent to reach just under 210 billion U.S. dollars by 2025.
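
As a sanity check on that projection, the standard compound-growth formula reproduces the headline number. The 2018 baseline below is a hypothetical round figure chosen for illustration, not the Statista value:

```python
# Compound annual growth rate (CAGR) projection.
def project(value, cagr, years):
    """Project a value forward at a constant compound annual growth rate."""
    return value * (1 + cagr) ** years

# A hypothetical ~$41B market in 2018 growing at 26% per year
# reaches roughly $207B by 2025 (7 years of compounding),
# consistent with "just under 210 billion U.S. dollars".
market_2025 = project(41.0, 0.26, 7)
print(f"{market_2025:.0f} billion USD")
```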

Size of the global market for industrial and non-industrial robots between 2018 and 2025 (in billion U.S. dollars). Source: Statista

Latest News & Research

Biohybrid bipedal robot powered by skeletal muscle tissue

by Ryuki Kinjo, Yuya Morimoto, Byeongwook Jo, Shoji Takeuchi in Matter

Compared to robots, human bodies are flexible, capable of fine movements, and can convert energy efficiently into movement. Drawing inspiration from human gait, researchers from Japan crafted a two-legged biohybrid robot by combining muscle tissues and artificial materials. This method allows the robot to walk and pivot.

“Research on biohybrid robots, which are a fusion of biology and mechanics, is recently attracting attention as a new field of robotics featuring biological function,” says corresponding author Shoji Takeuchi of the University of Tokyo, Japan. “Using muscle as actuators allows us to build a compact robot and achieve efficient, silent movements with a soft touch.”

The research team’s two-legged robot builds on the legacy of biohybrid robots that take advantage of muscles. Muscle tissues have driven biohybrid robots to crawl, swim straight forward, and make turns, though not sharp ones. Yet being able to pivot and make sharp turns is an essential feature for robots to avoid obstacles.

To build a nimbler robot with fine and delicate movements, the researchers designed a biohybrid robot that mimics human gait and operates in water. The robot has a foam buoy top and weighted legs to help it stand straight underwater. The skeleton of the robot is mainly made from silicone rubber that can bend and flex to conform to muscle movements. The researchers then attached strips of lab-grown skeletal muscle tissues to the silicone rubber and each leg.

When the researchers zapped the muscle tissue with electricity, the muscle contracted, lifting the leg up. The heel of the leg then landed forward when the electricity dissipated. By alternating the electric stimulation between the left and right leg every 5 seconds, the biohybrid robot successfully “walked” at the speed of 5.4 mm/min (0.002 mph). To turn, researchers repeatedly zapped the right leg every 5 seconds while the left leg served as an anchor. The robot made a 90-degree left turn in 62 seconds. The findings showed that the muscle-driven bipedal robot can walk, stop, and make fine-tuned turning motions.
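
For illustration, the stimulation schedule reported above can be written out as a simple open-loop plan. The function names and the per-step turn estimate below are ours, derived only from the timings in the text:

```python
import math

STIM_PERIOD_S = 5.0   # stimulation interval reported for the experiment

def walk_plan(duration_s):
    """Alternate left/right leg stimulation every 5 seconds, as in the
    walking experiment."""
    return [(i * STIM_PERIOD_S, "left" if i % 2 == 0 else "right")
            for i in range(int(duration_s // STIM_PERIOD_S))]

def turn_estimate(target_deg=90.0, total_turn_s=62.0):
    """A 90-degree turn in 62 s with one right-leg stimulation every 5 s
    implies ~13 stimulations, i.e. roughly 7 degrees per step."""
    steps = math.ceil(total_turn_s / STIM_PERIOD_S)
    return steps, target_deg / steps
```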

“Currently, we are manually moving a pair of electrodes to apply an electric field individually to the legs, which takes time,” says Takeuchi. “In the future, by integrating the electrodes into the robot, we expect to increase the speed more efficiently.”

The team also plans to give joints and thicker muscle tissues to the bipedal robot to enable more sophisticated and powerful movements. But before upgrading the robot with more biological components, Takeuchi says the team will have to integrate a nutrient supply system to sustain the living tissues, as well as device structures that allow the robot to operate in the air.

“A cheer broke out during our regular lab meeting when we saw the robot successfully walk on the video,” says Takeuchi. “Though they might seem like small steps, they are, in fact, giant leaps forward for the biohybrid robots.”

High-Speed Tactile Braille Reading via Biomimetic Sliding Interactions

by Parth Potdar, David Hardman, Elijah Almanzor, Fumiya Iida in IEEE Robotics and Automation Letters

Researchers have developed a robotic sensor that incorporates artificial intelligence techniques to read braille at speeds roughly double that of most human readers.

The research team, from the University of Cambridge, used machine learning algorithms to teach a robotic sensor to quickly slide over lines of braille text. The robot was able to read the braille at 315 words per minute at close to 90% accuracy.

Although the robot braille reader was not developed as an assistive technology, the researchers say the high sensitivity required to read braille makes it an ideal test in the development of robot hands or prosthetics with comparable sensitivity to human fingertips.

Human fingertips are remarkably sensitive and help us gather information about the world around us. Our fingertips can detect tiny changes in the texture of a material or help us know how much force to use when grasping an object: for example, picking up an egg without breaking it or a bowling ball without dropping it.

Reproducing that level of sensitivity in a robotic hand, in an energy-efficient way, is a big engineering challenge. In Professor Fumiya Iida’s lab in Cambridge’s Department of Engineering, researchers are developing solutions to this and other skills that humans find easy, but robots find difficult.

“The softness of human fingertips is one of the reasons we’re able to grip things with the right amount of pressure,” said Parth Potdar from Cambridge’s Department of Engineering and an undergraduate at Pembroke College, the paper’s first author. “For robotics, softness is a useful characteristic, but you also need lots of sensor information, and it’s tricky to have both at once, especially when dealing with flexible or deformable surfaces.”

Braille is an ideal test for a robot ‘fingertip’ as reading it requires high sensitivity, since the dots in each representative letter pattern are so close together. The researchers used an off-the-shelf sensor to develop a robotic braille reader that more accurately replicates human reading behaviour.

“There are existing robotic braille readers, but they only read one letter at a time, which is not how humans read,” said co-author David Hardman, also from the Department of Engineering. “Existing robotic braille readers work in a static way: they touch one letter pattern, read it, pull up from the surface, move over, lower onto the next letter pattern, and so on. We want something that’s more realistic and far more efficient.”

The robotic sensor the researchers used has a camera in its ‘fingertip’, and reads by using a combination of the information from the camera and the sensors. “This is a hard problem for roboticists as there’s a lot of image processing that needs to be done to remove motion blur, which is time and energy-consuming,” said Potdar.

The team developed machine learning algorithms so the robotic reader would be able to ‘deblur’ the images before the sensor attempted to recognise the letters. They trained the algorithm on a set of sharp images of braille with fake blur applied. After the algorithm had learned to deblur the letters, they used a computer vision model to detect and classify each character. Once the algorithms were incorporated, the researchers tested their reader by sliding it quickly along rows of braille characters. The robotic braille reader could read at 315 words per minute at 87% accuracy, which is twice as fast and about as accurate as a human Braille reader.
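
A minimal sketch of the augmentation step described above: synthetic horizontal motion blur applied to sharp images to build (blurred, sharp) training pairs for a deblurring model. The blur kernel and the toy dot pattern are our assumptions, not details from the paper:

```python
import numpy as np

def motion_blur(img, kernel_len=9):
    """Average the image with horizontally shifted copies of itself,
    mimicking the smear produced by a sensor sliding along a line."""
    h, w = img.shape
    out = np.zeros_like(img, dtype=float)
    for s in range(kernel_len):
        out[:, : w - s] += img[:, s:]
        out[:, w - s :] += img[:, -1:]   # pad with the edge column
    return out / kernel_len

# Build (blurred, sharp) pairs for training a deblurring model.
rng = np.random.default_rng(0)
sharp = (rng.random((32, 128)) > 0.9).astype(float)   # toy dot pattern
blurred = motion_blur(sharp)
training_pair = (blurred, sharp)
```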

“Considering that we used fake blur to train the algorithm, it was surprising how accurate it was at reading braille,” said Hardman. “We found a nice trade-off between speed and accuracy, which is also the case with human readers.”

“Braille reading speed is a great way to measure the dynamic performance of tactile sensing systems, so our findings could be applicable beyond braille, for applications like detecting surface textures or slippage in robotic manipulation,” said Potdar.

Automated self-optimization, intensification, and scale-up of photocatalysis in flow

by Aidan Slattery, Zhenghui Wen, Pauline Tenblad, Jesús Sanjosé-Orduna, Diego Pintossi, Tim den Hartog, Timothy Noël in Science

Chemists of the University of Amsterdam (UvA) have developed an autonomous chemical synthesis robot with an integrated AI-driven machine learning unit. Dubbed ‘RoboChem’, the benchtop device can outperform a human chemist in terms of speed and accuracy while also displaying a high level of ingenuity. As the first of its kind, it could significantly accelerate chemical discovery of molecules for pharmaceutical and many other applications.

RoboChem was developed by the group of Prof. Timothy Noël at the UvA’s Van ’t Hoff Institute for Molecular Sciences. Their paper shows that RoboChem is a precise and reliable chemist that can perform a variety of reactions while producing minimal amounts of waste. Working autonomously around the clock, the system delivers results quickly and tirelessly.

Noël: ‘In a week, we can optimise the synthesis of about ten to twenty molecules. This would take a PhD student several months.’ The robot not only yields the best reaction conditions, but also provides the settings for scale-up. ‘This means we can produce quantities that are directly relevant for suppliers to the pharmaceutical industry, for example.’

The expertise of the Noël group is in flow chemistry, a novel way of performing chemistry where a system of small, flexible tubes replaces beakers, flasks and other traditional chemistry tools. In RoboChem, a robotic needle carefully collects starting materials and mixes these together in small volumes of just over half a millilitre. These then flow through the tubing system towards the reactor. There, the light from powerful LEDs triggers the molecular conversion by activating a photocatalyst included in the reaction mixture. The flow then continues towards an automated NMR spectrometer that identifies the transformed molecules. These data are fed back in real-time to the computer that controls RoboChem.

‘This is the brain behind RoboChem,’ says Noël. ‘It processes the information using artificial intelligence. We use a machine learning algorithm that autonomously determines which reactions to perform. It always aims for the optimal outcome and constantly refines its understanding of the chemistry.’
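
In spirit, the closed loop looks like the sketch below: propose reaction conditions, “run” the reaction, read off the yield, and refine. The simulated yield function and the simple explore/exploit rule are stand-ins for the real flow hardware and the group’s machine learning optimizer:

```python
import random

random.seed(42)

def simulated_yield(residence_time_min, light_intensity):
    """Stand-in for running a reaction on the flow platform and reading
    the yield from the NMR; the real system measures this rather than
    modeling it."""
    return max(0.0, 100 - (residence_time_min - 4) ** 2 * 3
                        - (light_intensity - 0.7) ** 2 * 80)

best = (None, -1.0)
for trial in range(30):
    # Explore: sample fresh conditions; exploit: jitter the current best.
    if best[0] is None or random.random() < 0.3:
        cond = (random.uniform(1, 10), random.uniform(0.1, 1.0))
    else:
        t, p = best[0]
        cond = (min(10, max(1, t + random.gauss(0, 0.5))),
                min(1.0, max(0.1, p + random.gauss(0, 0.05))))
    y = simulated_yield(*cond)
    if y > best[1]:
        best = (cond, y)

print(f"best conditions: {best[0]}, yield {best[1]:.1f}%")
```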

RoboChem explores chemical reactions in a flow system that includes a LED-powered photochemical reactor. Image: UvA/HIMS.

The group put a lot of effort into substantiating RoboChem’s results. All of the molecules now included in the Science paper were isolated and checked manually. Noël says the system has impressed him with its ingenuity: ‘I have been working on photocatalysis for more than a decade now. Still, RoboChem has shown results that I would not have been able to predict. For instance, it has identified reactions that require only very little light. At times I had to scratch my head to fathom what it had done. You then wonder: would we have done it the same way? In retrospect, you see RoboChem’s logic. But I doubt if we would have obtained the same results ourselves. Or not as quickly, at least.’

The researchers also used RoboChem to replicate previous research published in four randomly selected papers. They then determined whether RoboChem produced the same or better results.

‘In about 80% of the cases, the system produced better yields. For the other 20%, the results were similar,’ Noël says. ‘This leaves me with no doubt that an AI-assisted approach will be beneficial to chemical discovery in the broadest possible sense.’

According to Noël, the relevance of RoboChem and other ‘computerised’ chemistry also lies in the generation of high-quality data, which will benefit the future use of AI. ‘In traditional chemical discovery only a few molecules are thoroughly researched. Results are then extrapolated to seemingly similar molecules. RoboChem produces a complete and comprehensive dataset where all relevant parameters are obtained for each individual molecule. That provides much more insight.’

Another feature is that the system also records ‘negative’ data. In current scientific practice, most published data only reflects successful experiments.

‘A failed experiment also provides relevant data,’ says Noël. ‘But this can only be found in the researchers’ handwritten lab notes. These are not published and thus unavailable for AI-powered chemistry. RoboChem will change that, too. I have no doubt that if you want to make breakthroughs in chemistry with AI, you will need these kinds of robots.’

A New 1-mg Fast Unimorph SMA-Based Actuator for Microrobotics

by Conor K. Trygstad, Xuan-Truc Nguyen, Néstor O. Pérez-Arancibia in Proceedings of the IEEE Robotics and Automation Society’s International Conference on Intelligent Robots and Systems

Two insect-like robots, a mini-bug and a water strider, developed at Washington State University, are the smallest, lightest, and fastest fully functional micro-robots known to have been created.

Such miniature robots could someday be used for work in areas such as artificial pollination, search and rescue, environmental monitoring, micro-fabrication, or robotic-assisted surgery. The researchers reported their work in the proceedings of the IEEE Robotics and Automation Society’s International Conference on Intelligent Robots and Systems. The mini-bug weighs in at eight milligrams, while the water strider weighs 55 milligrams; both can move at about six millimeters per second.

“That is fast compared to other micro-robots at this scale although it still lags behind their biological relatives,” said Conor Trygstad, a PhD student in the School of Mechanical and Materials Engineering and lead author on the work. An ant typically weighs up to five milligrams and can move at almost a meter per second.

The WaterStrider weighs 55 milligrams and can move at 6 millimeters per second (photo by Bob Hubner, WSU Photo Services).

The key to the tiny robots is the miniature actuators that make them move. Trygstad used a new fabrication technique to miniaturize the actuator down to less than a milligram, the smallest ever known to have been made.

“The actuators are the smallest and fastest ever developed for micro-robotics,” said Néstor O. Pérez-Arancibia, Flaherty Associate Professor in Engineering at WSU’s School of Mechanical and Materials Engineering who led the project.

The actuator uses a shape memory alloy, a material that changes shape when heated. It is called ‘shape memory’ because it remembers and then returns to its original shape. Unlike a typical motor that would move a robot, these alloys have no moving parts or spinning components.

“They’re very mechanically sound,” said Trygstad. “The development of the very lightweight actuator opens up new realms in micro-robotics.”

Shape memory alloys are not generally used for large-scale robotic movement because they are too slow. In the case of the WSU robots, however, the actuators are made of two tiny shape memory alloy wires that are 1/1000 of an inch in diameter. With a small amount of current, the wires can be heated up and cooled easily, allowing the robots to flap their fins or move their feet at up to 40 times per second. In preliminary tests, the actuator was also able to lift more than 150 times its own weight.

Compared to other technologies used to make robots move, the SMA technology requires only a very small amount of electricity or heat.

“The SMA system requires a lot less sophisticated systems to power them,” said Trygstad.

Trygstad, an avid fly fisherman, has long observed water striders and would like to further study their movements. While the WSU water strider robot does a flat flapping motion to move itself, the natural insect does a more efficient rowing motion with its legs, which is one of the reasons that the real thing can move much faster.

The researchers would like to copy another insect and develop a water strider-type robot that can move across the top of the water surface as well as just under it. They are also working to use tiny batteries or catalytic combustion to make their robots fully autonomous and untethered from a power supply.

Polysulfonated covalent organic framework as active electrode host for mobile cation guests in electrochemical soft actuator

by Manmatha Mahato, Mousumi Garai, Van Hiep Nguyen, Saewoong Oh, Sanghee Nam, Xiangrong Zeng, Hyunjoon Yoo, Rassoul Tabassian, Il-Kwon Oh in Science Advances

Soft robots, medical devices, and wearable devices have permeated our daily lives. KAIST researchers have developed a fluid switch using an ionic polymer artificial muscle that operates at ultra-low power and produces a force 34 times its own weight. Fluid switches control fluid flow, directing the fluid in a specific direction to invoke various movements.

KAIST (President Kwang-Hyung Lee) announced on the 4th of January that a research team under Professor IlKwon Oh from the Department of Mechanical Engineering has developed a soft fluidic switch that operates at ultra-low voltage and can be used in narrow spaces.

Artificial muscles imitate human muscles and provide flexible and natural movements compared to traditional motors, making them one of the basic elements used in soft robots, medical devices, and wearable devices. These artificial muscles create movements in response to external stimuli such as electricity, air pressure, and temperature changes, and in order to utilize artificial muscles, it is important to control these movements precisely.

Switches based on existing motors were difficult to use within limited spaces due to their rigidity and large size. In order to address these issues, the research team developed an electro-ionic soft actuator that can control fluid flow while producing large amounts of force, even in a narrow pipe, and used it as a soft fluidic switch.

Synthesis and fundamental use of pS-COFs as common electrode-electrolyte host for electroactive soft fluidic switch.

The ionic polymer artificial muscle developed by the research team is composed of metal electrodes and ionic polymers, and it generates force and movement in response to electricity. A polysulfonated covalent organic framework (pS-COF) made by combining organic molecules on the surface of the artificial muscle electrode was used to generate an impressive amount of force relative to its weight with ultra-low power (~0.01V).

As a result, the artificial muscle, which was manufactured to be as thin as a hair with a thickness of 180 µm, produced a force more than 34 times its own weight of 10 mg to initiate smooth movement. Through this, the research team was able to precisely control the direction of fluid flow with low power.
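
As a back-of-the-envelope check (our arithmetic, assuming standard gravity), 34 times a 10 mg weight corresponds to only a few millinewtons of force:

```python
g = 9.81                  # m/s^2, standard gravity
mass_kg = 10e-6           # the actuator's 10 mg mass
force_N = 34 * mass_kg * g
print(f"{force_N * 1e3:.2f} mN")   # ≈ 3.34 mN
```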

Professor IlKwon Oh, who led this research, said, “The electrochemical soft fluidic switch that operates at ultra-low power can open up many possibilities in the fields of soft robots, soft electronics, and microfluidics based on fluid control.” He added, “From smart fibers to biomedical devices, this technology has the potential to be immediately put to use in a variety of industrial settings as it can be easily applied to ultra-small electronic systems in our daily lives.”

A Three-dimensional Comparison of Pre- and Post-component Position in a Series of Off-label Robotic-assisted Revision Total Knee Arthroplasties

by Micah Macaskill, Richard Peluso, Jonathan Lash, Timothy E. Hewett, Matthew Bullock, Alexander Caughran in Arthroplasty Today

An innovative study at Marshall University explores the use of robotic-assisted joint replacement in revision knee scenarios, comparing the pre- and post-revision implant positions in a series of revision total knee arthroplasties (TKA) using a state-of-the-art robotic arm system.

In this retrospective study, the orthopaedic team at the Marshall University Joan C. Edwards School of Medicine and Marshall Health performed 25 revision knee replacements with a robotic-assisted computer system. The procedure involved placing new implants at the end of the thighbone and top of the shinbone with the computer’s aid to ensure the knee was stable and balanced throughout the range of motion. Researchers then carefully compared the initial positions of the primary implants with the final planned positions of the robotic revision implants for each patient, assessing the differences in millimeters and degrees.

The analysis found that exceedingly small changes in implant position significantly influence the function of the knee replacement. Robotic assistance during revision surgery has the potential to measure these slight differences. In addition, the computer system can help the surgeon predict what size implant to use as well as help to balance the knee for stability.

Robotic implant planning page depicting the final position of the robotic revision implant (RRI) in green overlying the Primary implants (PI) in white.

“Robotic-assisted surgery has the potential to change the way surgeons think about revision knee replacement,” said Matthew Bullock, D.O., associate professor of orthopaedic surgery and co-author on the study. “The precision offered by robotic-assisted surgery not only enhances the surgical process but also holds promise for improved patient outcomes. Besides infection, knee replacements usually fail because they become loose from the bone or because they are unbalanced, leading to pain and instability. When this happens, patients can have difficulty with activities of daily living such as walking long distances or negotiating stairs.”

The study underscores the importance of aligning the prosthesis during revision surgery. The research also suggests potential advantages, including appropriately sized implants that can impact the ligament tension which is crucial for functional knee revisions.

“These findings open new doors in the realm of revision knee arthroplasty,” said Alexander Caughran, M.D., assistant professor of orthopaedic surgery and co-author on the study. “We continue to collect more data for future studies on patient outcomes after robotic revision knee replacement. We anticipate that further research and technological advancements in the realm of artificial intelligence will continue to shape the landscape of orthopaedic surgery.”

Unsupervised and supervised AI on molecular dynamics simulations reveals complex characteristics of HLA-A2-peptide immunogenicity

by Jeffrey K Weber, Joseph A Morrone, Seung-gu Kang, Leili Zhang, Lijun Lang, Diego Chowell, Chirag Krishna, Tien Huynh, Prerana Parthasarathy, Binquan Luan, Tyler J Alban, Wendy D Cornell, Timothy A Chan in Briefings in Bioinformatics

Researchers from Cleveland Clinic and IBM have published a strategy for identifying new targets for immunotherapy through artificial intelligence (AI). This is the first peer-reviewed publication from the two organizations’ Discovery Accelerator partnership, designed to advance research in healthcare and life sciences.

The team worked together to develop supervised and unsupervised AI to reveal the molecular characteristics of peptide antigens, small pieces of protein molecules immune cells use to recognize threats. Project members came from diverse groups led by Cleveland Clinic’s Timothy Chan, M.D., Ph.D., as well as IBM’s Jeff Weber, Ph.D., Senior Research Scientist, and Wendy Cornell, Ph.D., Manager and Strategy Lead for Healthcare and Life Sciences Accelerated Discovery.

“In the past, all our data on cancer antigen targets came from trial and error,” says Dr. Chan, chair of Cleveland Clinic’s Center for Immunotherapy and Precision Immuno-Oncology and Sheikha Fatima Bint Mubarak Endowed Chair in Immunotherapy and Precision Immuno-Oncology. “Partnering with IBM allows us to push the boundaries of artificial intelligence and health sciences research to change the way we develop and evaluate targets for cancer therapy.”

Slowest relaxation timescales estimated by unsupervised AI highlight concerted MHC–peptide dynamics and multiple peptide presentation modes.

For decades, scientists have been researching how to better identify antigens and use them to attack cancer cells or cells infected with viruses. This task has proved challenging because antigen peptides interact with immune cells based on specific features on the surface of the cells, a process which is still not well understood. Research has been limited by the sheer number of variables that affect how immune systems recognize these targets. Identifying these variables is difficult and time intensive with regular computing, so current models are limited and at times inaccurate.

The study found that AI models that account for changes in molecular shape over time can accurately depict how immune systems recognize a target antigen. Through these models, researchers could home in on what processes are critical to target with immunotherapy treatments such as vaccines and engineered immune cells. Researchers can incorporate these insights into other AI models moving forward to identify more effective immunotherapy targets.

“These discoveries are an example of what makes this partnership successful — combining IBM’s cutting-edge computational resources with Cleveland Clinic’s medical expertise,” Dr. Weber says. “These findings resulted from a key collaboration between everyone from a world-class expert in cancer immunotherapy to our physics-based simulation and AI experts. Collaboration when combined with innovation has terrific potential.”

Capturing complex hand movements and object interactions using machine learning-powered stretchable smart textile gloves

by Arvin Tashakori, Zenan Jiang, Amir Servati, Saeid Soltanian, Harishkumar Narayana, Katherine Le, Caroline Nakayama, Chieh-ling Yang, Z. Jane Wang, Janice J. Eng, Peyman Servati in Nature Machine Intelligence

This month, a group of stroke survivors in B.C. will test a new technology designed to aid their recovery, and ultimately restore use of their limbs and hands. Participants will wear a new groundbreaking “smart glove” capable of tracking their hand and finger movements during rehabilitation exercises supervised by Dr. Janice Eng, a leading stroke rehabilitation specialist and professor of medicine at UBC.

The glove incorporates a sophisticated network of highly sensitive sensor yarns and pressure sensors that are woven into a comfortable stretchy fabric, enabling it to track, capture and wirelessly transmit even the smallest hand and finger movements.

“With this glove, we can monitor patients’ hand and finger movements without the need for cameras. We can then analyze and fine-tune their exercise programs for the best possible results, even remotely,” says Dr. Eng.

UBC electrical and computer engineering professor Dr. Peyman Servati, PhD student Arvin Tashakori and their team at their startup, Texavie, created the smart glove for collaboration on the stroke project. Dr. Servati highlighted a number of breakthroughs.

“This is the most accurate glove we know of that can track hand and finger movement and grasping force without requiring motion-capture cameras. Thanks to machine learning models we developed, the glove can accurately determine the angles of all finger joints and the wrist as they move. The technology is highly precise and fast, capable of detecting small stretches and pressures and predicting movement with at least 99-per-cent accuracy — matching the performance of costly motion-capture cameras.”
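
The sensor-to-joint-angle mapping can be illustrated with a toy regression. The real glove uses trained machine learning models on yarn stretch and pressure signals; the linear least-squares fit and synthetic data below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the glove's learned mapping: synthetic "stretch"
# readings from 16 sensor yarns, mapped to 5 joint angles.
n_samples, n_sensors, n_joints = 200, 16, 5
true_W = rng.normal(size=(n_sensors, n_joints))
stretch = rng.normal(size=(n_samples, n_sensors))            # sensor readings
angles = stretch @ true_W + rng.normal(scale=0.01, size=(n_samples, n_joints))

# Fit a linear map from sensor readings to joint angles.
W, *_ = np.linalg.lstsq(stretch, angles, rcond=None)
pred = stretch @ W
err = np.abs(pred - angles).mean()   # mean absolute angle error
```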

Fabrication and characteristics of HSYs.

Unlike other products in the market, the glove is wireless and comfortable, and can be easily washed after removing the battery. Dr. Servati and his team have developed advanced methods to manufacture the smart gloves and related apparel at a relatively low cost locally.

Dr. Servati envisions a seamless transition of the glove into the consumer market with ongoing improvements, in collaboration with different industrial partners. The team also sees potential applications in virtual reality and augmented reality, animation and robotics.

“Imagine being able to accurately capture hand movements and interactions with objects and have it automatically display on a screen. There are endless applications. You can type text without needing a physical keyboard, control a robot, or translate American Sign Language into written speech in real time, providing easier communication for individuals who are deaf or hard of hearing.”

Automatic recording of rare behaviors of wild animals using video bio-loggers with on-board light-weight outlier detector

by Kei Tanigaki, Ryoma Otsuka, Aiyi Li, Yota Hatano, Yuanzhou Wei, Shiho Koyama, Ken Yoda, Takuya Maekawa in PNAS Nexus

Have you ever wondered what wild animals do all day? Documentaries offer a glimpse into their lives, but animals under a watchful eye rarely do anything interesting, and the true essence of their behaviors remains elusive. Now, researchers from Japan have developed a camera system that allows us to capture these behaviors.

In a study, researchers from Osaka University have created a small sensor-based data logger (called a bio-logger) that automatically detects and records video of infrequent behaviors in wild seabirds without supervision by researchers.

Infrequent behaviors, such as diving into the water for food, can lead to new insights or even new directions in research. But observing enough of these behaviors to infer any results is difficult, especially when these behaviors take place in an environment that is not hospitable to humans, such as the open ocean. As a result, the detailed behaviors of these animals remain largely unknown.

“Video cameras attached to the animal are an excellent way to observe behavior,” says Kei Tanigaki, lead author of the study. However, video cameras are very power hungry, and this leads to a trade-off. “Either the video only records until the battery runs out, in which case you might miss the rare behavior, or you use a larger, heavier battery, which is not suitable for the animal.”

To avoid having to make this choice for the wild seabirds under study, the team uses low-power sensors, such as accelerometers, to determine when an unusual behavior is taking place. The camera is then turned on, the behavior is recorded, and the camera powers off until the next event. This bio-logger is the first to use artificial intelligence for this task.
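The duty-cycling described above can be sketched as a simple loop: score each low-power sensor reading, and switch the camera on only when the score crosses a threshold. This is a hypothetical illustration; the threshold, clip length, and score stream are invented.

```python
# Hypothetical sketch of sensor-triggered recording: the power-hungry
# camera stays off until an outlier score exceeds a threshold.
def run_logger(scores, threshold=0.6, clip_len=3):
    """Return the indices at which a video clip would start."""
    recordings = []
    cooldown = 0  # > 0 means the camera is busy recording a clip
    for i, score in enumerate(scores):
        if cooldown > 0:
            cooldown -= 1          # camera on, keep recording
            continue
        if score > threshold:      # outlier detector fires
            recordings.append(i)   # power the camera on here
            cooldown = clip_len - 1
    return recordings

# A stream in which two unusual events occur (scores 0.9 and 0.8).
stream = [0.1, 0.2, 0.9, 0.3, 0.2, 0.1, 0.8, 0.7, 0.1]
print(run_logger(stream))  # → [2, 6]
```

Only the cheap scoring runs continuously; the camera, the dominant power draw, wakes for a fixed clip per detected event.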

“We use a method called an isolation forest,” says Takuya Maekawa, senior author. “This method detects outlier events well, but like many other artificial intelligence algorithms, it is computationally complex. This means, like the video cameras, it is power hungry.” For the bio-loggers, the researchers needed a light-weight algorithm, so they trained the original isolation forest on their data and then used it as a “teacher” to train a smaller “student” outlier detector installed on the bio-logger.
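The teacher-student step can be sketched as follows. This is not the authors' on-board implementation; it assumes scikit-learn and uses a full isolation forest as the "teacher" whose anomaly scores a single small decision tree (the "student") learns to reproduce cheaply. The simulated accelerometer features are invented.

```python
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.tree import DecisionTreeRegressor

# Illustrative teacher-student distillation: a heavy isolation forest
# scores the data, and a depth-limited tree learns to mimic those scores.
rng = np.random.default_rng(1)

# Simulated 3-axis features: mostly routine motion, plus a few unusual bursts.
routine = rng.normal(0.0, 1.0, size=(500, 3))
unusual = rng.normal(5.0, 1.0, size=(10, 3))
X = np.vstack([routine, unusual])

teacher = IsolationForest(n_estimators=100, random_state=0).fit(X)
teacher_scores = teacher.decision_function(X)  # lower = more anomalous

# Distil into a tiny tree, cheap enough to evaluate on a microcontroller.
student = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X, teacher_scores)

# The student should rank the unusual bursts as more anomalous than routine motion.
print(student.predict(unusual).mean(), student.predict(routine).mean())
```

The student trades a little accuracy for a drastic cut in computation, which is the point for a battery-powered bio-logger.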

Example use of the bio-loggers for rare event detection and recording. A) The bio-logger used in this study.

The final bio-logger weighs 23 g, less than 5% of the body weight of the Streaked Shearwaters under study. Eighteen bio-loggers were deployed, collecting a total of 205 hours of low-power sensor data and 76 five-minute videos. This was enough data to reveal novel aspects of the birds’ head-shaking and foraging behaviors.

This approach, which overcomes the battery-life limitation of most bio-loggers, will help us understand the behaviors of wildlife that ventures into human-inhabited areas. It will also enable observation of animals in extreme environments inaccessible to humans. Many other rare behaviors, from sweet-potato washing by Japanese monkeys to penguins feeding on jellyfish, can now be studied.

Harnessing synthetic active particles for physical reservoir computing

by Xiangzun Wang, Frank Cichos in Nature Communications

Artificial intelligence using neural networks performs calculations digitally with the help of microelectronic chips. Physicists at Leipzig University have now created a type of neural network that works not with electricity but with so-called active colloidal particles. In their publication, the researchers describe how these microparticles can be used as a physical system for artificial intelligence and the prediction of time series.

“Our neural network belongs to the field of physical reservoir computing, which uses the dynamics of physical processes, such as water surfaces, bacteria or octopus tentacle models, to make calculations,” says Professor Frank Cichos, whose research group developed the network with the support of ScaDS.AI. One of five new AI centres in Germany, ScaDS.AI, with sites in Leipzig and Dresden, has been funded since 2019 as part of the German government’s AI Strategy and is supported by the Federal Ministry of Education and Research and the Free State of Saxony.

“In our realization, we use synthetic self-propelled particles that are only a few micrometres in size,” explains Cichos. “We show that these can be used for calculations and at the same time present a method that suppresses the influence of disruptive effects, such as noise, in the movement of the colloidal particles.” Colloidal particles are particles that are finely dispersed in their dispersion medium (solid, gas or liquid).

Experimental realization.

For their experiments, the physicists developed tiny units made of plastic and gold nanoparticles, in which one particle rotates around another, driven by a laser. These units have certain physical properties that make them interesting for reservoir computing.

“Each of these units can process information, and many units make up the so-called reservoir. We change the rotational motion of the particles in the reservoir using an input signal. The resulting rotation contains the outcome of a calculation,” explains Dr Xiangzun Wang. “Like many neural networks, the system needs to be trained to perform a particular calculation.”

The researchers were particularly interested in noise. “Because our system contains extremely small particles in water, the reservoir is subject to strong noise, similar to the noise that all molecules in a brain are subject to,” says Professor Cichos. “This noise, Brownian motion, severely disrupts the functioning of the reservoir computer and usually requires a very large reservoir to remedy. In our work, we have found that using past states of the reservoir can improve computer performance, allowing smaller reservoirs to be used for certain computations under noisy conditions.”
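The idea of feeding past reservoir states into the readout can be sketched with a conventional echo-state-style reservoir in software (numpy, not colloidal particles; all sizes, noise levels, and the sine-wave task are invented for illustration). Noise perturbs every state update, and the readout is trained on the current state concatenated with a few past states:

```python
import numpy as np

# Illustrative echo-state-style sketch: a noisy reservoir whose linear
# readout uses current AND past reservoir states (a time-delay embedding),
# which helps average out the noise. Not the colloidal-particle system.
rng = np.random.default_rng(2)
n_res, T, delays = 30, 600, 5

W_in = rng.normal(scale=0.5, size=n_res)
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale for echo-state stability

u = np.sin(0.1 * np.arange(T))   # input time series
target = np.roll(u, -1)          # task: predict the next value

# Drive the reservoir; Brownian-like noise perturbs each state update.
states = np.zeros((T, n_res))
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t] + 0.05 * rng.normal(size=n_res))
    states[t] = x

# Readout features: the current state concatenated with the last few states.
feats = np.hstack([np.roll(states, d, axis=0) for d in range(delays)])
feats, y = feats[delays:-1], target[delays:-1]  # drop wrap-around rows

W_out, *_ = np.linalg.lstsq(feats, y, rcond=None)
rmse = np.sqrt(np.mean((feats @ W_out - y) ** 2))
print(rmse)  # prediction error of the trained readout
```

Setting `delays = 1` recovers the usual memoryless readout; increasing it enlarges the effective reservoir without adding physical nodes, which mirrors the paper's finding that past states let smaller reservoirs cope with noise.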

Cichos adds that this has not only contributed to the field of information processing with active matter, but has also yielded a method that can optimise reservoir computation by reducing noise.

Subscribe to Paradigm!

Medium. Twitter. Telegram. Telegram Chat. Reddit. LinkedIn.

Main sources

Research articles

Science Robotics

Science Daily

IEEE Spectrum
