How I spent a one-year robotics research program in Japan

Mohammed Tahri Sqalli
9 min read · Jan 1, 2017


I spent one year in a robotics research program in Japan. Throughout this year, I worked on two robot projects, built an obstacle avoidance algorithm for a telepresence robot, published two academic papers, and presented one at an international conference. Here is how I did it.

How it all began…

Japan has been the childhood dream of almost every Moroccan child who spent their very long summer afternoons watching Japanese anime translated into Arabic. Its weird and exciting innovations, especially in the field of robotics, are the main reason why, when the opportunity presented itself, I hopped on a 22-hour flight to discover the land of the rising sun for the first time.

The research culture

In Japan, academic research is considered a crucial part of the academic curriculum. Whether you are a Bachelor student or a Master student, you need to join a research laboratory.

A research laboratory is usually composed of a professor supervising about 20 students in his field of research. If you are a Bachelor student, your final senior year is your research year. If you are a Master student, you are supposed to join a research laboratory in addition to your classes during the two years of your degree. Some of my lab mates spent three years in the same lab (one Bachelor year and two Master years), which gave them a consistent amount of time to specialize in their research.

In the lab, all the research equipment and resources are available, and the lab is open 24/7 to anyone willing to work.

Tatsuno Sensei’s Research Team 2016
Research Environment

Mohammed’s case

I came to Dr. Tatsuno’s Laboratory at Meijo University in Nagoya, Japan, with a grant from the Japanese national government, not knowing which research category (Bachelor level or Master level) I would be placed in, especially since my status at my home university (Al Akhawayn University) is a combined bachelor/master degree in computer science and software engineering. I therefore took a gap year at my university to sort out how this year would count towards my degree after the end of my research program.

Professor Tatsuno was flexible and kind enough to take my case into consideration and designed an appropriate research plan for me. I was in charge of designing a new obstacle avoidance algorithm for the already developed telepresence robot. A telepresence robot is a robot that mimics the presence of a human being in a place where the human cannot physically be at that specific moment. As I showed interest in the field and wanted to know more, I started to explore the other projects, besides my own, that Pr. Tatsuno was supervising.

From Computer Science to Software Engineering to Robotics

I have spent the past four to five years searching for opportunities, evaluating them, and grabbing the best ones.

I started with a Bachelor of Science in Computer Science; midway through it, I went for a semester abroad at Binghamton University. When I came back, I realized that software engineering was probably the next big thing, so I went for a combined Bachelor/Master in Software Engineering.

Before starting the Master classes, the opportunity for a robotics research year came along, so I did not hesitate for a second. Although I knew that the robotics field is quite different from software development, I followed my usual habit of adapting to the opportunity and learning another field for the sake of science.

Getting to learn to “drive a robot”

I came to Japan with no prior background in robotics, especially since the field is virtually nonexistent in Morocco. All I knew came from a few demos of telepresence robots I had seen at hackathons in the US.

Getting to learn to “drive a robot” is much like learning to drive a car, since both share many movement characteristics.

I had to identify the components of the robot, understand the basic circuitry, and, more importantly, understand the cultural reasons behind the project.

Hajimemashite Meijo Kun!

I got to meet and greet Meijo Kun, our telepresence robot, on the very first day of my research. So please say “Hajimemashite” (Nice to meet you) to Meijo Kun.

Meijo Kun, The telepresence robot I worked on during my research year.

Meijo Kun lives in an elderly care facility in a city called Shimane. He cares about the Japanese elderly and is always present for them. One can ask Meijo Kun for help to visit a loved one in an elderly care facility.

Meijo Kun is a little over one meter tall; he talks, greets, sees, and walks.

Meijo Kun has four vital organs without which it could not perform its tasks:

  • Task planner
  • Human Robot Interface
  • Vehicle Controller
  • Vision

This year (2016), I got to be Meijo Kun’s new vision doctor, so I got him some new glasses and taught him a trick or two about how to walk.

Meijo Kun’s Eyesight problem

Before becoming Meijo Kun’s new eyesight doctor, I had to analyze what Meijo Kun used in order to see its environment. Meijo Kun uses a Laser Range-Finder (LRF), visible a little above the wheels in the figure above.

An LRF is a distance sensor heavily used in robotics: it uses laser light to measure how far an object is from the sensor. With that data, Meijo Kun can tell whether an obstacle is near.

The LRF is quite limited in its abilities, as it scans the environment only in 2D. It is located around 20 to 30 cm above the ground, as seen in the previous picture. This means that obstacles above that height (such as desks), below it (such as descending stairs), or just beyond its range of sight cannot be seen, which can end in a collision or a fall for the telepresence robot.
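To give an idea of how a robot uses an LRF scan, here is a minimal sketch in Python. It assumes a simplified scan: a list of distances, one per beam, with out-of-range beams reported at the sensor's maximum range. The function name and parameter values are illustrative, not part of Meijo Kun's actual software.

```python
import math

def nearest_obstacle(ranges, angle_min, angle_step, max_range=5.6):
    """Return (distance, angle) of the closest valid LRF reading.

    ranges: list of distances in meters, one per scan angle.
    Readings at or beyond max_range mean 'nothing detected'.
    """
    best = None
    for i, r in enumerate(ranges):
        if 0.02 < r < max_range:  # discard sensor noise and out-of-range hits
            angle = angle_min + i * angle_step
            if best is None or r < best[0]:
                best = (r, angle)
    return best

# A toy 5-beam scan sweeping from -45 to +45 degrees
scan = [5.6, 1.2, 0.8, 2.5, 5.6]
print(nearest_obstacle(scan, math.radians(-45), math.radians(22.5)))
# -> closest hit is 0.8 m, straight ahead (angle 0.0)
```

From a result like this, the robot only knows how far the nearest object is in its scanning plane; anything above or below that plane is invisible, which is exactly the limitation described above.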

Getting Meijo Kun some new spectacles

These are the new spectacles that Meijo Kun will be using:

Specifications and description of the Kinect used as the main vision hardware.

The Kinect is a multi-purpose sensor camera, known as an RGB-D camera: RGB stands for the high-resolution 1080p color camera, and D for the depth camera. With the depth sensor inside the Kinect, it is easy to capture not only a normal RGB image but a 3D depth model as well.

The Kinect is the main hardware in the new obstacle avoidance algorithm.
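The key idea behind turning a depth image into a 3D point cloud is back-projecting each pixel through the standard pinhole camera model. The sketch below shows this for a single pixel; the intrinsic parameters (fx, fy, cx, cy) are placeholder values roughly in the ballpark of a Kinect v2 depth camera (512x424 pixels) and would come from calibration on a real system.

```python
def depth_to_point(u, v, depth_m, fx=365.0, fy=365.0, cx=256.0, cy=212.0):
    """Back-project a depth pixel (u, v) to a 3D point in meters.

    fx, fy: focal lengths in pixels; cx, cy: optical center.
    These are illustrative values, not calibrated Kinect intrinsics.
    """
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A pixel at the optical center maps to a point straight ahead:
print(depth_to_point(256, 212, 2.0))  # (0.0, 0.0, 2.0)
```

Applying this to every pixel of a depth frame yields the point clouds shown in the results below.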

Teaching Meijo Kun how to walk again

Simplified Simultaneous Localization And Mapping (SLAM) Algorithm

The above figure shows the training that Meijo Kun got in order to learn how to walk effectively again. The above algorithm is a tweaked version of a general Simultaneous Localization and Mapping (SLAM) algorithm.

This algorithm is one of the most widely used algorithms for mobile robots. As its name implies, the robot needs to build a map of the environment it is located in while, at the same time, locating itself in the map it has already created or is in the process of creating.

This problem (the SLAM problem) is known as a chicken-and-egg problem: in order for the robot to build a map from scratch, it needs to know its location inside that map, and in order to locate itself in the map, it needs to have the map first.

Simultaneous Localization and Mapping is based on two essential parts:

  • Assembling many 3D images/point clouds so as to create a large, extensive map of the environment where the robot is located.
  • The use of odometry by the robot to locate itself inside the map it has already created. (An odometer is the same kind of device a car uses to measure the distance traveled.)
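The odometry part above can be sketched with a simple dead-reckoning update for a differential-drive robot: given how far each wheel has turned, update the robot's estimated pose. The wheel_base value below is a placeholder; a real implementation would use the robot's measured geometry.

```python
import math

def integrate_odometry(pose, d_left, d_right, wheel_base=0.4):
    """Update a (x, y, heading) pose from wheel displacements in meters.

    Differential-drive dead reckoning; wheel_base is the distance
    between the two wheels (illustrative value, not Meijo Kun's).
    """
    x, y, theta = pose
    d = (d_left + d_right) / 2.0               # distance moved by robot center
    dtheta = (d_right - d_left) / wheel_base   # change in heading
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    return (x, y, theta + dtheta)

pose = (0.0, 0.0, 0.0)
pose = integrate_odometry(pose, 0.5, 0.5)  # both wheels forward 0.5 m
print(pose)  # (0.5, 0.0, 0.0): moved straight ahead
```

Because wheel slip makes these estimates drift over time, SLAM constantly corrects the odometry estimate against the map being built, which is what makes the two parts inseparable.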

Some Results…

These are some results of the experiments that we conducted to test the effectiveness of our algorithm.

We captured our very first 3D image/Point Cloud using Meijo Kun, and here is how it looks. The image shows a portion of our laboratory.

First Captured 3D image

The 3D point cloud was a little complicated to handle, as the environment itself was complicated to understand and analyze, so we moved to a simpler environment and a simpler map.

We suggested a 2D map of our laboratory hallway. Below are some images of the results of our second experiment. It is important to note that whatever is shown in white in the 2D map is considered an obstacle for the robot.

Using some filtering (so that the floor, seen in the images below, is not considered an obstacle), we got a simple 2D map of our hallway.
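The simplest form of such filtering is a height threshold: any point close enough to the floor plane is dropped before the map is built. The sketch below assumes z is the height axis and uses illustrative threshold values, not the ones from our actual experiment.

```python
def remove_floor(points, floor_z=0.0, tolerance=0.05):
    """Drop points within `tolerance` meters of the floor plane.

    points: iterable of (x, y, z) tuples, z being height in meters.
    Assumes the floor sits near z = floor_z (illustrative values).
    """
    return [p for p in points if p[2] > floor_z + tolerance]

cloud = [(1.0, 0.5, 0.01), (1.2, 0.4, 0.02), (1.1, 0.5, 0.60)]
print(remove_floor(cloud))  # only the point 0.60 m above the floor remains
```

A real pipeline would first estimate the floor plane itself (it is rarely perfectly level in sensor coordinates), but the principle is the same.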

We also succeeded in locating the robot inside the map (the + mark in the 2D map is the current location of our robot).

2D Map of Laboratory Hallway (Left: Simple image for comparison, Middle: 2D Map Before Filtering, Right: 2D Map after Filtering)

After that we tried to reproduce the same experiment but with a 3D map. Following is the result of our experiment.

3D Map of Laboratory Hallway

The coloring in the figure shows the difference between the first and second 3D captures. The scale shows that between the orange and the green point clouds, there was a movement of approximately 1 meter.

Using these experiments as a basis, we succeeded in creating small maps of several environments, such as our laboratory, the hallway, and the garden near the university building. Meijo Kun also succeeded in locating itself in them.

Oh, and one more thing…!

As I was working on the Meijo Kun project, I had the privilege of working with my lab mates on the power distribution line maintenance robot. This robot aims to automate the maintenance of public electricity power lines. I worked on its vision component as well, since it is fairly similar to the vision system of the Meijo Kun robot.

By the end of my research year, we succeeded in making the robot recognize objects and grab them autonomously. Below is a small demonstration of the power distribution line maintenance robot recognizing and grabbing a bolt autonomously.

The hard work paid off!

The whole team working on both projects got to publish papers and present the work at two conferences:

I hope this post encourages anyone hoping to learn something new, or who is undecided about exploring a new research field or learning opportunity, to grasp that opportunity. This year has been truly fruitful, both personally and academically.

Have any questions?

Feel free to reach out through Twitter (@tahrisqalli) or shoot me an email.
