This is a comprehensive project aimed at developing computer vision algorithms integrated with the kinematics of a 5-DOF manipulator. The project focuses on detecting and manipulating blocks within the reach of the robot arm.

🤖 KinoVision: Vision-Guided Block Handling

  • Note: This group project was completed as part of the ROB 550: Robotic Systems Lab course between January and March 2024. The original code is hosted on GitLab; this GitHub version demonstrates the results and lets users follow the steps smoothly.
  • PDF: Armlab report

🎯 Goal

Develop computer vision algorithms and work with the kinematics of a 5-DOF manipulator to detect and manipulate blocks on a board within the reach of the robot arm.

⚙️ Prerequisites

🖥️ Hardware

💻 Software

  • ROS2 Humble
  • Realsense2_camera: ROS driver node for the Realsense camera. It enables us to stream the camera output as a ROS message. You can find information on the Intel RealSense GitHub
  • interbotix_sdk: ROS driver node for the Interbotix Robot Arm. This node allows us to subscribe to topics relating to the joint states, representing the angle of the joints read from the motors. You can find more information on the GitHub

🗂️ Code structure

  • install_scripts
    • install_scripts/config
      • rs_l515_launch.py - to launch the camera
      • tags_Standard41h12.yaml - to define the AprilTags used on the board
    • install_Dependencies.sh - to install ROS2, all the ROS wrappers, and dependencies
    • install_Interbotix.sh - to install the arm-related packages
    • install_LaunchFiles.sh - to move the files under /config to where they should be
  • launch - to store the launch files
  • src - where the actual code is written
    • camera.py - Implements the Camera class for the RealSense camera.
      • Functions to capture and convert frames
      • Functions to load camera calibration data
      • Functions to find and perform 2D transforms
      • Functions to perform world-to-camera and camera-to-world transforms
      • Functions to detect blocks in the depth and RGB frames
    • control_station.py
      • This is the main program. It sets up the threads and callback functions. It takes flags for whether to use the product of exponentials (PoX) or Denavit-Hartenberg (DH) table for forward kinematics, and an argument for the DH table or PoX configuration file.
    • kinematics.py - Implements functions for forward and inverse kinematics
    • rxarm.py - Implements the RXArm class
      • Feedback from joints
      • Functions to command the joints
      • Functions to get feedback from joints
      • Functions to do FK and IK
      • A run function to update the Dynamixel servos
      • A function to read the RX200 arm config file
    • state_machine.py - Implements the StateMachine class
      • The state machine is the heart of the controller
  • config
    • rx200_dh.csv - Contains the DH table for the RX200 arm
    • rx200_pox.csv - Contains the S list and M matrix for the RX200 arm.
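
Among the transforms `camera.py` implements, camera-to-world back-projection can be sketched with a pinhole model. The function, the intrinsic matrix values, and the extrinsic transform below are illustrative stand-ins, not the repository's actual API:

```python
import numpy as np

def pixel_to_world(u, v, depth, K, T_world_cam):
    """Back-project a pixel (u, v) with a measured depth into world coordinates.

    K           : 3x3 camera intrinsic matrix
    T_world_cam : 4x4 homogeneous transform from the camera frame to the world frame
    depth       : depth along the optical axis, in metres
    """
    # Pixel -> camera-frame point, scaled by the measured depth
    p_cam = depth * np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Camera frame -> world frame via the homogeneous transform
    p_world = T_world_cam @ np.append(p_cam, 1.0)
    return p_world[:3]

# Illustrative intrinsics and an identity extrinsic for a quick check
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
T = np.eye(4)
print(pixel_to_world(320, 240, 1.0, K, T))  # principal point at 1 m -> [0, 0, 1]
```

In the real pipeline the extrinsic transform comes from the AprilTag-based calibration described below, and the depth comes from the RealSense depth frame.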

🛠️ Installation

  1. Clone this repository
git clone https://github.com/SuryaPratapSingh37/KinoVision.git
  2. Install all the dependencies and packages
cd KinoVision/install_scripts
./install_Dependencies.sh

Wait until it's complete before proceeding to the next step.

  3. Install the arm-related packages - Source

./install_Interbotix.sh

During the installation, you'll encounter prompts. For prompts related to the AprilTag and MATLAB-ROS installation, type no and press Enter. Wait until it's complete before proceeding to the next step.

  4. Move the config files

./install_LaunchFiles.sh

This script moves the config files into place. The configurations are based on the AprilTag family we have and the specific camera model we use.

  5. Install the camera calibration package

  • Open a new terminal, then copy and run the following command
./install_Calibration.sh
  • (Optional) To verify environment setup, open a new terminal and run
printenv | grep ROS
  • The above should print all the environment variables associated with ROS
  6. Now reboot the computer
sudo reboot
  7. Test the installation
  • Connect the camera USB and arm USB to the computer, then open a terminal
cd KinoVision/launch
chmod +x launch_armlab.sh
./launch_armlab.sh

The above starts 3 nodes: camera, apriltag, interbotix_arm

  • Open another terminal
cd KinoVision/launch
chmod +x launch_control_station.sh
./launch_control_station.sh

This starts the control station GUI and validates successful setup of the package.

🔄 Control station pipeline

📊 Results

📈 Teach-and-Repeat motion

  • Added two buttons to the Control station GUI
  • To swap two blocks, trajectory waypoints are recorded using the 'Teach' button, while the 'Repeat' button plays back the recorded waypoints. For path planning, a low-level PID controller is used in joint space.
GUI Video
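
The Teach/Repeat logic can be sketched as record-then-replay over joint-space waypoints. The class below and the arm interface (`get_positions`/`set_positions`) are hypothetical stand-ins for the actual GUI callbacks and RXArm methods:

```python
import time

class TeachRepeat:
    """Minimal teach-and-repeat sketch: record joint-space waypoints,
    then replay them in order."""

    def __init__(self, arm):
        self.arm = arm          # object exposing get_positions() / set_positions()
        self.waypoints = []

    def teach(self):
        # Snapshot the current joint angles as one waypoint
        self.waypoints.append(list(self.arm.get_positions()))

    def repeat(self, dwell=0.5):
        # Command each recorded waypoint in order; the low-level PID
        # controller drives the joints to each target
        for wp in self.waypoints:
            self.arm.set_positions(wp)
            time.sleep(dwell)

class FakeArm:
    """Stand-in for the 5-DOF arm, recording every commanded pose."""
    def __init__(self):
        self.q = [0.0] * 5
        self.history = []
    def get_positions(self):
        return self.q
    def set_positions(self, q):
        self.q = list(q)
        self.history.append(list(q))

arm = FakeArm()
tr = TeachRepeat(arm)
tr.teach()                          # record the home pose
arm.q = [0.1, 0.2, 0.0, 0.0, 0.0]   # move the arm by hand (simulated)
tr.teach()                          # record a second pose
tr.repeat(dwell=0.0)
print(arm.history)                  # the two recorded waypoints, in order
```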

📏 Camera calibration

🛠️ (Optional) Intrinsic camera calibration

  • This step is optional because the camera ships with an internal factory calibration. However, if the camera has undergone physical alterations or damage over time, the intrinsic calibration parameters may have changed. For this you'll need a checkerboard
  • Start Realsense2 camera node
ros2 launch realsense2_camera rs_l515_launch.py
  • Start the AprilTag detection node
ros2 run apriltag_ros apriltag_node --ros-args \
      -r image_rect:=/camera/color/image_raw \
      -r camera_info:=/camera/color/camera_info \
      --params-file `ros2 pkg prefix apriltag_ros`/share/apriltag_ros/cfg/tags_Standard41h12.yaml
  • Start the arm node
ros2 launch interbotix_xsarm_control xsarm_control.launch.py robot_model:=rx200

This command launches RViz with the virtual robot model, which mirrors exactly how the arm is moving.

  • Start the camera calibration node
cd ~/image_pipeline
source install/setup.bash
# the size of the checkerboard and the dimensions of its squares may vary
ros2 run camera_calibration cameracalibrator --size 6x8 --square 0.025 \
    --no-service-check --ros-args \
    -r image:=/camera/color/image_raw  \
    -p camera:=/camera

This will open a calibration window. Move the checkerboard in front of the camera in various positions and orientations until all four bars (X, Y, Size, Skew) in the window turn green (the longer the bars, the better the calibration). When satisfied, click 'Calibrate' and then 'Save'.
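
The intrinsic matrix K produced by this calibration maps camera-frame points to pixel coordinates, which is what the block-detection pipeline relies on. A minimal sketch, assuming a distortion-free pinhole model and illustrative values for K:

```python
import numpy as np

def project(point_cam, K):
    """Project a 3-D point in the camera frame to pixel coordinates
    using the calibrated intrinsic matrix K (lens distortion ignored)."""
    uvw = K @ point_cam
    return uvw[:2] / uvw[2]     # perspective divide

# Illustrative intrinsics (focal lengths and principal point in pixels)
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])

# A point 10 cm right of the optical axis at 1 m depth
print(project(np.array([0.1, 0.0, 1.0]), K))  # -> [380, 240]
```

If the true focal lengths or principal point drift from the factory values, every projected and back-projected point shifts accordingly, which is why recalibration matters after physical damage.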

🔧 Automatic Extrinsic (& homography) camera calibration using GUI

  • Robustness is validated by manually changing the camera orientation; the grid points are drawn post-calibration

🤖 Kinematics

📐 Forward kinematics (Product of exponentials approach)
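With the S list and M matrix stored in rx200_pox.csv, the product-of-exponentials forward kinematics is T = e^[S1]θ1 ··· e^[Sn]θn · M. A minimal sketch with a toy one-joint example follows; the screw axis and home pose below are illustrative, not the RX200's actual values:

```python
import numpy as np

def skew(w):
    """3x3 skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_twist(S, theta):
    """Matrix exponential of a unit screw axis S = (w, v) scaled by theta
    (Rodrigues' formula for the rotation block)."""
    w, v = S[:3], S[3:]
    W = skew(w)
    R = np.eye(3) + np.sin(theta) * W + (1 - np.cos(theta)) * W @ W
    G = np.eye(3) * theta + (1 - np.cos(theta)) * W + (theta - np.sin(theta)) * W @ W
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = G @ v
    return T

def fk_poe(M, S_list, thetas):
    """Product-of-exponentials FK: T = e^[S1]th1 ... e^[Sn]thn  M."""
    T = np.eye(4)
    for S, th in zip(S_list, thetas):
        T = T @ exp_twist(np.asarray(S, dtype=float), th)
    return T @ M

# Toy example: one revolute joint about z at the origin,
# home pose M a pure translation of 1 m along x
M = np.eye(4)
M[0, 3] = 1.0
S_list = [[0, 0, 1, 0, 0, 0]]
T = fk_poe(M, S_list, [np.pi / 2])
print(np.round(T[:3, 3], 6))   # end-effector rotates to [0, 1, 0]
```

The same loop extends to all five joints of the RX200 once the real S list and M matrix are loaded from the CSV.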

🔄 Inverse kinematics (Analytical approach)

🧱 Block detection

🖱️ Click to Grab/Drop (via GUI)

📦 Automatic Sorting of the blocks (based on size)

🙏 Credits

This project would not have been successful without the invaluable guidance and support of:

  • Dr. Peter Gaskell
  • Dr. Greg Formosa
  • Abhishek Narula

Special thanks to my team members:

  • Guangyi Liu
  • Jayaprakash Harshavardhan
