YOLO Integration

Vignesh Gopalakrishnan
3 min read · Feb 11, 2024

Robotics and Autonomous Systems (AI)

Image source: https://shop.elephantrobotics.com/products/artificial-intelligence-kit-2023

Overview

  • Abstract
  • Introduction
  • Methodology
    - System Setup
    - Object Detection
    - Robotic Arm Control
    - Workflow
    - Implementation
    - Challenges and Learnings
  • Results
  • Future Improvements
  • Conclusion

Abstract:

This project demonstrates the integration of a robotic arm equipped with a camera for real-time object detection and sorting. The system uses the YOLO (You Only Look Once) model for efficient and accurate object detection. The goal is to identify various objects, such as a basketball or an apple, with the YOLO model and employ a robotic arm to pick them up and place them into separate bins based on the first letter of each object's class name.

Introduction:

The integration of computer vision and robotics has witnessed transformative advancements, particularly in real-time object recognition and manipulation. This project combines the prowess of the You Only Look Once (YOLO) object detection model with the MyCobot 280 robotic arm, showcasing the potential for seamless object sorting.

The primary objective is to demonstrate the efficiency of YOLO, an advanced object detection algorithm, when coupled with a MyCobot 280 robotic arm. The system identifies specific objects like basketballs and apples, utilizing YOLO’s real-time detection capabilities. Subsequently, the robotic arm autonomously picks up and places these objects into predefined bins based on the first letter of the object.

Methodology:

System Setup:

· The robotic arm, a MyCobot 280, is equipped with a camera for real-time object detection.

· YOLO (You Only Look Once) is used as the object detection model, implemented in Python with OpenCV and PyMyCobot libraries.
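A minimal initialization sketch for this setup is shown below. The serial port, baud rate, and camera index are assumptions (typical MyCobot 280 defaults); adjust them for your own wiring.

```python
# Setup sketch: open the arm-mounted camera and connect to the MyCobot 280.
# SERIAL_PORT, BAUD_RATE, and CAMERA_INDEX are assumed defaults.
SERIAL_PORT = "/dev/ttyAMA0"
BAUD_RATE = 115200
CAMERA_INDEX = 0

def open_camera(index=CAMERA_INDEX):
    """Open the camera mounted on the arm; fail loudly if it is missing."""
    import cv2
    cap = cv2.VideoCapture(index)
    if not cap.isOpened():
        raise RuntimeError(f"Camera {index} could not be opened")
    return cap

def connect_arm(port=SERIAL_PORT, baud=BAUD_RATE):
    """Connect to the MyCobot 280 over serial using the pymycobot library."""
    from pymycobot.mycobot import MyCobot
    return MyCobot(port, baud)
```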

Object Detection:

· YOLO is employed to detect objects in real-time.

· The system is trained to recognize specific classes of objects, such as a basketball, apple, etc.
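The detection step can be sketched as follows. The filtering logic is plain Python; the model call itself is shown with the `ultralytics` YOLOv8 API, which is an assumption since the post does not name a specific YOLO implementation, and it lives in a function that should only be run on the robot.

```python
CONF_THRESHOLD = 0.5  # assumed confidence cutoff

def filter_detections(detections, wanted_labels, conf=CONF_THRESHOLD):
    """Keep only detections of the classes we sort, above a confidence cutoff.

    `detections` is a list of (label, confidence) pairs.
    """
    wanted = {w.lower() for w in wanted_labels}
    return [(label, score) for label, score in detections
            if label.lower() in wanted and score >= conf]

def detect_frame(frame, model):
    """Run one YOLO inference and return (label, confidence) pairs."""
    result = model(frame)[0]
    return [(result.names[int(box.cls)], float(box.conf))
            for box in result.boxes]

def run_detection_once():
    """Grab a frame and detect sortable objects on it (robot only)."""
    import cv2
    from ultralytics import YOLO  # assumed YOLO implementation

    model = YOLO("yolov8n.pt")   # assumed pretrained COCO weights
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if not ok:
        raise RuntimeError("could not read a frame from the camera")
    return filter_detections(detect_frame(frame, model),
                             ["sports ball", "apple"])
```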

Robotic Arm Control:

· The MyCobot 280 robotic arm is programmed to pick up and place objects based on their detected classes.

· Specific angles and coordinates are defined for the robotic arm to execute precise movements.
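One way to keep the motion code readable is to separate the pose table from execution, as sketched below. All joint angles here are illustrative placeholders rather than calibrated values; `send_angles` and `set_gripper_state` are pymycobot calls.

```python
import time

# Joint-angle poses for the MyCobot 280 (placeholder values, not calibrated).
HOME = [0, 0, 0, 0, 0, 0]                          # neutral home pose
PICK_POSE = [10.0, -45.0, 30.0, 15.0, 0.0, 0.0]    # assumed pick pose
BIN_POSES = {
    "a": [60.0, -30.0, 20.0, 10.0, 0.0, 0.0],      # assumed bin poses
    "b": [-60.0, -30.0, 20.0, 10.0, 0.0, 0.0],
}

def pick_and_place_sequence(bin_key):
    """Build the ordered (angles, gripper_open) steps for one sort."""
    drop = BIN_POSES[bin_key]
    return [
        (PICK_POSE, True),   # approach the object with gripper open
        (PICK_POSE, False),  # close the gripper to grasp
        (drop, False),       # carry to the bin
        (drop, True),        # release into the bin
        (HOME, True),        # return to the home pose
    ]

def execute(mc, sequence, speed=50):
    """Play a sequence on a connected pymycobot MyCobot instance."""
    for angles, gripper_open in sequence:
        mc.send_angles(angles, speed)
        time.sleep(2)  # crude wait: MyCobot moves are asynchronous
        mc.set_gripper_state(0 if gripper_open else 1, speed)
        time.sleep(1)
```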

Workflow:

· The system captures images from the camera and processes them through the YOLO model.

· The detected class of each object determines the bin for placement by the robotic arm: once an object is identified, the first letter of its class name selects the bin it is placed into.
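The first-letter routing described above reduces to a small lookup. A sketch, where the letter-to-bin table is an assumption matching the basketball/apple example:

```python
# Assumed bins: one for 'a'-objects (apple), one for 'b'-objects (basketball).
BIN_OF_LETTER = {"a": 0, "b": 1}

def bin_for(label):
    """Route a detected class name to a bin by its first letter.

    Returns the bin id, or None when no bin is assigned to that letter.
    """
    label = label.strip().lower()
    if not label:
        return None
    return BIN_OF_LETTER.get(label[0])
```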

Implementation:

· The YOLO model is integrated into the system, allowing real-time object detection with high accuracy.

· The MyCobot 280 robotic arm is controlled to execute predefined movements for grasping and placing objects.

Challenges and Learnings:

· Calibration of the camera and coordination between the YOLO model and robotic arm posed challenges.

· Understanding and fine-tuning the YOLO model for specific object classes was a key learning experience.

Results:

The system successfully detects and sorts various objects, including a basketball and an apple, based on the first letter of their class names, using the YOLO model and the MyCobot 280 robotic arm.

Future Improvements:

· Further refinement of the YOLO model for improved object recognition.

· Integration of additional sensors for enhanced object detection capabilities.

Conclusion:

This project showcases the effective integration of computer vision and robotics for automated object sorting. The combination of YOLO-based object detection and precise control of the MyCobot 280 robotic arm results in an efficient and reliable sorting system for diverse objects.

Vignesh Gopalakrishnan

MS in Robotics and Autonomous Systems (AI) | ASU | Ex-Data Scientist @Crayon Data | Recommendation Engine | NLP