Controlling a KUKA youBot

Eve Pardi
5 min read · Jan 19, 2024


Introduction

Robotics is a field of science and engineering that involves the design, control, and operation of machines that perform tasks efficiently and safely. An expandable and modular robotic system, the KUKA youBot is widely used in education and research today in several areas, such as rescuing people from collapsed buildings, improving the performance of industrial robots, assembling furniture, or monitoring deformations of bridge structures. The solution discussed in this paper aims to teach newcomers the kinematics of a robot by controlling the movements of a KUKA youBot through a graphical user interface (GUI).

Preparing the simulation

The KUKA youBot is added to the simulator environment in CoppeliaSim. It is a mobile, terrestrial robot with four Swedish wheels installed on its base, which allow it to rotate and to move in lateral and forward directions, giving the base three degrees of freedom (DoF). These wheels have complex mechanics and control, making the robot more stable, manoeuvrable, and controllable. The robotic arm is built up of five joints and has five DoF; the end-effector is a gripper with two joints.

The KUKA youBot in CoppeliaSim

A perspective vision sensor is added to the base of the robot and rendered with the Persistence of Vision Ray Tracer (POV-Ray). A floating view is also added so the sensor output is visible in real time.

Coverage of the POV-Ray sensor:

  • Near/far clipping plane (m): 0.01/30
  • Perspective angle/Orthogonal size: 50

Sensor settings (CoppeliaSim): a vision sensor and two cuboids are added for better visibility.

Find the scene file for CoppeliaSim on GitHub.

Implementation

Connection between CoppeliaSim and MATLAB

A CoppeliaSim scene is created, and the following line of code is added to its child script to start the remote API server:

simRemoteApi.start(19999) -- the port must match the one passed to simxStart() in MATLAB

A controller GUI is created with GUIDE in MATLAB, which can be used to remotely control the robot.

The controller holds a button to validate the connection to the simulation, a button to take a picture with the vision sensor, four sliders to manipulate the joints of the robotic arm, and a joystick to control the movement of the base. Each slider is configured with minimum and maximum values based on the angles the corresponding joint can rotate through. The panel can also display images taken by the vision sensor. The figure file is available on GitHub and should be added to the MATLAB working folder.

The controller.m file should also be placed in the MATLAB working folder. The script holds two functions for each control on the panel: one is executed when the object is created (setting, for example, the background color), and the other is called when an action is performed, such as moving a slider or pressing a button. The code in controller.m sets up the connection between the controller and the simulator. When the button is pressed, the connection is tested and verified, and a message is displayed on success or failure:

sim = remApi('remoteApi');   % load the remote API library
sim.simxFinish(-1);          % close any previously opened connections
clientID = sim.simxStart('127.0.0.1', 19999, true, true, 5000, 5);

if clientID > -1
    disp('Connected to remote API server');
    f = msgbox("The connection to the remote API succeeded.", "Success");
else
    disp('Failed connecting to remote API server');
    f = msgbox("Failed to connect to the remote API.", "Error", "error");
end
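As a sketch of the GUIDE two-function pattern described above (the slider handle and callback names are illustrative, not taken from the original project):

```matlab
% Executed once when the slider object is created (GUIDE CreateFcn):
% here it only sets the background color.
function slider6_CreateFcn(hObject, eventdata, handles)
if isequal(get(hObject,'BackgroundColor'), get(0,'defaultUicontrolBackgroundColor'))
    set(hObject, 'BackgroundColor', [.9 .9 .9]);
end

% Executed every time the slider is moved (GUIDE Callback):
% reads the current slider position for later use.
function slider6_Callback(hObject, eventdata, handles)
value = get(hObject, 'Value');   % current slider position
```

GUIDE generates these function stubs automatically when the figure is saved; only their bodies need to be filled in.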

Take pictures with the Vision sensor

A button is responsible for capturing an RGB image with the Vision Sensor. If successful, a .jpg file is saved in the working directory.

% Get a handle to the vision sensor named 'Cam' in the scene
[returnCode1,sensor] = sim.simxGetObjectHandle(clientID, 'Cam', sim.simx_opmode_blocking);
% Read the current image from the sensor
[returnCode2,resolution,Image] = sim.simxGetVisionSensorImage2(clientID, sensor, 2, sim.simx_opmode_oneshot_wait);

imwrite(Image, 'img.jpg');
disp('An image is captured and saved to img.jpg.');

% Reload the saved image and display it on the controller panel
im1 = imread('img.jpg');
im1 = im2double(im1);

axes(handles.axes1);
imshow(im1);

To save a greyscale image instead, change the options argument of the simxGetVisionSensorImage2() function from 2 to 1. After taking the picture, the image is displayed on the controller.
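For reference, the greyscale variant might look like the following sketch (it reuses the sensor handle from the capture code above and assumes the call returns a single-channel image):

```matlab
% options = 1 requests a single-channel (greyscale) image
[rc, res, ImageG] = sim.simxGetVisionSensorImage2(clientID, sensor, 1, sim.simx_opmode_oneshot_wait);
imwrite(ImageG, 'img_grey.jpg');   % save alongside the RGB capture
imshow(ImageG);                    % display the greyscale image
```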

Moving the arm

When a slider in the controller is moved, its current value is read.

Each slider is configured with the minimum and maximum angles the corresponding arm joint can move through. This information is based on the following image:

youBot arm dimensions (source: YouBot Detailed Specifications, youBot wiki, no date)

Based on this information, the slider limits for the joints can be configured through the MATLAB UIControl properties.
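A sketch of such a setup is shown below. The slider handle names (slider3 through slider6) are assumptions, and the limits are approximate joint ranges in degrees taken from the youBot arm specification referenced above:

```matlab
% Configure each slider's range to the corresponding arm joint's
% approximate rotation limits (degrees, from the youBot spec sheet).
% Slider names slider3..slider6 are illustrative.
set(handles.slider3, 'Min', -169,   'Max', 169);    % arm joint 1
set(handles.slider4, 'Min', -65,    'Max', 90);     % arm joint 2
set(handles.slider5, 'Min', -151,   'Max', 146);    % arm joint 3
set(handles.slider6, 'Min', -102.5, 'Max', 102.5);  % arm joint 4
```

Constraining the sliders this way prevents the GUI from ever commanding a target outside the joint's mechanical range.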

The joint handle is retrieved from the simulator, then the simxSetJointTargetPosition() function moves the arm joint to the angle given by the slider position.

% Read the slider value and move arm joint 4 accordingly
slider_pos5 = int64(get(handles.slider6, 'Value'));
[r, j4] = sim.simxGetObjectHandle(clientID, 'youBotArmJoint4', sim.simx_opmode_blocking);
sim.simxSetJointTargetPosition(clientID, j4, slider_pos5, sim.simx_opmode_streaming);
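Note that simxSetJointTargetPosition() interprets the target for a revolute joint in radians. If the slider ranges are configured in degrees, a conversion step avoids commanding far-too-large angles; a sketch (assuming degree-valued sliders, same handle names as above):

```matlab
% Slider delivers degrees; the simulator expects radians for revolute joints
slider_pos5 = get(handles.slider6, 'Value');   % slider value in degrees
target = deg2rad(slider_pos5);                 % convert to radians
[r, j4] = sim.simxGetObjectHandle(clientID, 'youBotArmJoint4', sim.simx_opmode_blocking);
sim.simxSetJointTargetPosition(clientID, j4, target, sim.simx_opmode_streaming);
```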

Moving the base

The base is moved by a “joystick”, which includes buttons to move forward, backward, to the left, to the right, and to rotate. Another button stops the robot at its current position when pressed. In each case, the wheel joint handles are collected from the simulator, and then the simxSetJointTargetVelocity() function drives each wheel.

The definition of the velocity is based on the following:

Movement definition for the wheel joints

Here is an example code for moving the robot’s base to the right:

% Handles for the four wheel joints (front-left, rear-left, rear-right, front-right)
[r1,w1] = sim.simxGetObjectHandle(clientID, 'rollingJoint_fl', sim.simx_opmode_blocking);
[r2,w2] = sim.simxGetObjectHandle(clientID, 'rollingJoint_rl', sim.simx_opmode_blocking);
[r3,w3] = sim.simxGetObjectHandle(clientID, 'rollingJoint_rr', sim.simx_opmode_blocking);
[r4,w4] = sim.simxGetObjectHandle(clientID, 'rollingJoint_fr', sim.simx_opmode_blocking);

% Opposite velocities on the left and right wheel pairs produce lateral motion
sim.simxSetJointTargetVelocity(clientID,w1,20,sim.simx_opmode_streaming);
sim.simxSetJointTargetVelocity(clientID,w2,-20,sim.simx_opmode_streaming);
sim.simxSetJointTargetVelocity(clientID,w3,20,sim.simx_opmode_streaming);
sim.simxSetJointTargetVelocity(clientID,w4,-20,sim.simx_opmode_streaming);
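Since each joystick button only differs in the four velocity values it applies, the repeated handle lookups and velocity calls can be collected into a small helper. A sketch (the helper name is illustrative; the “move right” and “stop” patterns follow the example above):

```matlab
function setWheelVelocities(sim, clientID, vFL, vRL, vRR, vFR)
% Apply a target velocity to each of the four youBot wheel joints.
names = {'rollingJoint_fl', 'rollingJoint_rl', 'rollingJoint_rr', 'rollingJoint_fr'};
v = [vFL, vRL, vRR, vFR];
for i = 1:4
    [~, w] = sim.simxGetObjectHandle(clientID, names{i}, sim.simx_opmode_blocking);
    sim.simxSetJointTargetVelocity(clientID, w, v(i), sim.simx_opmode_streaming);
end
end
```

With this helper, moving right is setWheelVelocities(sim, clientID, 20, -20, 20, -20), and the stop button simply passes four zeros.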

Conclusion

This solution covers how to create a remote-control GUI to manipulate the arm and base of a KUKA youBot. As an improvement, the robot could be given the ability to stop when it gets too close to an obstacle, which could be supported by installing more capable sensors.

Solution recording

Solution on GitHub: https://github.com/ExOblivione/robotics2023



Eve Pardi

Microsoft AI MVP | Senior Artificial Intelligence Consultant @ Avanade