Python project for visual servoing of a 6-DOF robotic arm (xArm 6). An external camera is used to detect two QR codes: one encoding the word ‘robot’, which marks the arm’s gripper position, and the other encoding the word ‘target’, which marks the target object’s position.
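
A minimal sketch of how the detection step might look, assuming OpenCV’s built-in QR decoder is run on frames from the external camera; the image topic name and the use of cv_bridge are illustrative assumptions, not taken from this repository:

```python
import cv2
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()
detector = cv2.QRCodeDetector()

def image_callback(msg):
    """Decode every QR code in the frame and keep the 'robot'/'target' marker centers."""
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    found, texts, points, _ = detector.detectAndDecodeMulti(frame)
    if not found:
        return
    centers = {}
    for text, corners in zip(texts, points):
        if text in ("robot", "target"):
            # Use the centroid of the four QR corners as the marker position in pixels.
            centers[text] = corners.mean(axis=0)
    if "robot" in centers and "target" in centers:
        rospy.loginfo("robot at %s, target at %s", centers["robot"], centers["target"])

rospy.init_node("qr_marker_detector")
rospy.Subscriber("/camera/image_raw", Image, image_callback)  # assumed topic name
rospy.spin()
```
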
The relative position of these two points of interest is determined, and a trajectory is computed and executed so that the arm’s gripper reaches the desired position.
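
The relative-position step could be as simple as the following sketch, where the pixel offset between the two markers is scaled into a Cartesian displacement for the gripper; the scale factor and axis mapping are assumptions for illustration, not this repository’s calibration:

```python
import numpy as np

METERS_PER_PIXEL = 0.001  # assumed camera scale factor (calibration-dependent)

def gripper_displacement(robot_px, target_px):
    """Return a (dx, dy) displacement in metres that moves the gripper toward the target."""
    offset_px = np.asarray(target_px, dtype=float) - np.asarray(robot_px, dtype=float)
    return offset_px * METERS_PER_PIXEL

# Example: the target marker is 120 px to the right of and 40 px above the gripper marker.
print(gripper_displacement((320, 240), (440, 200)))  # -> [ 0.12 -0.04]
```
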
This project runs on ROS Melodic. The xArm developer demo packages for ROS simulation were used (1), and the MoveIt motion planning framework (2, 3) must be installed in the workspace for the project to run properly.
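
A hedged sketch of how the computed displacement could be executed through MoveIt’s Python interface (moveit_commander); the planning-group name "xarm6" and the displacement values are assumptions for illustration:

```python
import sys
import rospy
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("visual_servo_move")

group = moveit_commander.MoveGroupCommander("xarm6")  # assumed planning group name

# Take the current gripper pose and apply the displacement computed from the markers.
target = group.get_current_pose().pose
target.position.x += 0.12   # dx from the marker offset (assumed values)
target.position.y += -0.04  # dy from the marker offset

group.set_pose_target(target)
success = group.go(wait=True)  # plan and execute the trajectory
group.stop()
group.clear_pose_targets()
rospy.loginfo("Motion succeeded: %s", success)
```
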

computer vision, visual servoing, xarm, ros, gazebo, python