Controlling a Reachy robot in Unity

Gaelle Lannuzel
Published in pollen-robotics
5 min read · Jun 28, 2022
Simulated Reachy in a Unity scene throwing a ball into a basket

Introduction

At Pollen Robotics, getting more and more people involved in interactive robotics has always been our main motivation. We strongly believe that making robotics accessible is what will let cool new apps see the light of day!

That’s why we created our humanoid robot Reachy a couple of years back. We designed it as an open-source platform, so researchers and R&D teams can prototype their own real-world applications with it. To make it even simpler, we added a teleoperation app!

Woman teleoperating Reachy robot using an Oculus Quest 2 VR headset
Reachy can be teleoperated with VR!

Since then, Reachy has been sold all around the world and has been used as an arm prosthesis, to distribute Covid-19 masks, and to help students learn AI. It’s amazing to see what applications Reachy users are inventing, and we keep discovering new ones every day!

Reachy playing tic-tac-toe, used as an AI platform in a university (Howest)
Reachy’s right arm teleoperated with the residual shoulder movements of a right-arm amputee.
Illustration of an amputee participant controlling Reachy’s arm with his stump movements at the Ugecam Tours de Gassies, during an experiment that is part of Effie Ségas’s PhD, led by the INCIA Hybrid Team directed by Aymar de Rugy

But let’s face it… Humanoid robots are complex beasts…

As we said… Humanoid robots are complex!

Working on a real robot requires a wide range of skills: software, mechanics, electronics… On top of that, humanoid robots are still pretty expensive and quite fragile. They are thus only accessible to a tiny fraction of the passionate roboticists out there. Don’t worry, we will keep working on making them cheaper and easier to use. But let’s be honest, it may take a while…

So in the meantime, a great option to let everyone try all of their ideas is to use simulation!

Reachy in Unity

To make our simulator really accessible, we have decided to rely on a powerful and widely used tool: Unity.

Our simulator simply consists of a Unity package that you can import into your own Unity scenes. You can then control your simulated Reachy just like the real one: move the arms, turn the head, get the camera images and even control the antennas! To do that, you can use our Python SDK.

Reachy robot is moving its head and waving its left hand to say hello
Reachy moving in a scene made with Reachy2021 Unity package

Ever wanted to be a robot? The simulated robot is also compatible with our VR application!

Get started

But enough talking, let’s try to make our first app with Reachy in Unity!

To get started, you of course need Unity (version 2020.1 or later), and a few other elements to download:

  1. Download the latest release of reachy2021-simulator.unitypackage from our GitHub
  2. Download the grpc_unity_package from the gRPC daily builds

Then create your project using Reachy 2021 simulator:

Unity project
Here is a simple project with the necessary elements to use Reachy in Unity
  1. Create a new 3D Unity project (or open an existing one)
  2. Extract the previously downloaded grpc_unity_package, and paste the Plugins folder directly into the Assets folder of your Unity project.
  3. From the menu Assets/Import Package/Custom Package…, import the reachy2021-simulator.unitypackage you previously downloaded.
  4. Drag and drop Reachy and the Server from the Prefabs folder into your scene.
  5. Move the Main Camera in the scene so that it faces Reachy.
  6. Then click Play to get started!

You have several ways to control the simulated robot:

  • Using our Python SDK
  • Using the VR teleoperation app
  • Creating your own gRPC client (in any language you may prefer)

Check how it moves with our Python SDK

Let’s connect to the robot in your new scene using our Python SDK. To do so, use your favorite Python 3.7+ environment and install reachy-sdk:

pip install reachy-sdk

Then you are ready to go! Let’s build this welcome movement out of a few simple functions.

Reachy robot waving its hand to say hello in Unity

We begin by importing the required modules: ReachySDK for the connection to the robot, goto to generate trajectories, and InterpolationMode to choose the profile of these trajectories. We also import time to handle the timing of our movements.
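Here is a minimal sketch of that setup (the import paths come from the reachy-sdk package; we assume the Unity scene runs on the same machine, so the robot is reachable on localhost):

    import time

    from reachy_sdk import ReachySDK
    from reachy_sdk.trajectory import goto
    from reachy_sdk.trajectory.interpolation import InterpolationMode

    # Connect to the simulated robot: with the Unity scene playing on the
    # same machine, the server is reachable on localhost.
    reachy = ReachySDK(host='localhost')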

First we want to create a function that defines simple movements of the antennas, so that Reachy shakes them to welcome you:
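A minimal sketch of such a function could look like this (the name happy_antennas is ours; the l_antenna and r_antenna joints belong to reachy.head in the SDK, and the angles and timings below are purely illustrative):

    def happy_antennas():
        # Shake both antennas back and forth a few times (angles in degrees).
        for _ in range(5):
            reachy.head.l_antenna.goal_position = 20.0
            reachy.head.r_antenna.goal_position = -20.0
            time.sleep(0.2)
            reachy.head.l_antenna.goal_position = -20.0
            reachy.head.r_antenna.goal_position = 20.0
            time.sleep(0.2)
        # Bring the antennas back to their rest position.
        reachy.head.l_antenna.goal_position = 0.0
        reachy.head.r_antenna.goal_position = 0.0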

Then we write a hello_left_arm function that makes Reachy move its left arm and wave its wrist in a greeting gesture:
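A hedged sketch of what hello_left_arm could do, using goto with the left-arm joint names from the Reachy 2021 SDK (the target angles and durations are illustrative, not the exact values of our script):

    def hello_left_arm():
        # Raise the left arm into a waving pose (joint angles in degrees).
        hello_pose = {
            reachy.l_arm.l_shoulder_pitch: 0,
            reachy.l_arm.l_shoulder_roll: 10,
            reachy.l_arm.l_arm_yaw: -15,
            reachy.l_arm.l_elbow_pitch: -100,
            reachy.l_arm.l_forearm_yaw: 0,
            reachy.l_arm.l_wrist_pitch: 0,
            reachy.l_arm.l_wrist_roll: 0,
        }
        goto(goal_positions=hello_pose, duration=1.5,
             interpolation_mode=InterpolationMode.MINIMUM_JERK)

        # Wave by moving the wrist back and forth.
        for _ in range(3):
            goto({reachy.l_arm.l_wrist_pitch: 20}, duration=0.3)
            goto({reachy.l_arm.l_wrist_pitch: -20}, duration=0.3)

        # Bring the arm back down along a smooth trajectory.
        goto({joint: 0 for joint in hello_pose}, duration=1.5,
             interpolation_mode=InterpolationMode.MINIMUM_JERK)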

It would also be nice to add some head movement at the beginning: we can use either look_at or goto to generate head movements.
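With look_at, which points the head towards a 3D point given in the robot frame, a sketch could be (the function name head_motion and the target points are ours):

    def head_motion():
        # Look slightly to the left, then to the right, then straight ahead.
        # look_at() aims the head at a point (x, y, z) in metres, expressed
        # in the robot frame, reached over the given duration.
        reachy.head.look_at(x=0.5, y=0.3, z=0.1, duration=1.0)
        reachy.head.look_at(x=0.5, y=-0.3, z=0.1, duration=1.0)
        reachy.head.look_at(x=0.5, y=0.0, z=0.0, duration=1.0)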

We are finally ready to generate the whole movement!
Let’s say hello with Reachy:
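Putting the pieces together, we can simply run the sketched functions one after the other (the real helloworld script on GitHub sequences things a bit differently):

    if __name__ == '__main__':
        head_motion()
        hello_left_arm()
        happy_antennas()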

Don’t forget to click Play in Unity before launching the script; otherwise your Python script won’t find a server to connect to.

Find the whole helloworld script on GitHub.

Test our scene!

Ready to go even further?

Clone our repository, go to our office scene and try playing basketball with the office kit. You will discover even more features of Reachy!

Will you manage to throw the ball in the basket?

Reachy robot throws a ball in a basket in a Unity scene based on the package
Grab the ball and throw it: can you aim for the basket?

Find the basketball script to do so on GitHub!

Now that you have everything needed to use your own simulated Reachy, you can start working on a custom scene or even on your own app!

You can join our Discord and share your work with our community. We’d love to hear about what you come up with!
