Using AR to Control a Six-Axis Robot Arm
Principal Investigator: Professor Billo
AWaRE REU Researcher: Daniel Riehm, University of Notre Dame
Project Description: Augmented Reality (AR) headsets offer a suite of hardware and software tools conducive to real-time, user-friendly manipulation of virtual 3D objects anchored at persistent locations around the user. This research sought to apply these tools to the programming and control of a six-axis Universal Robots arm, allowing the user either to control the arm in real time or to pre-program a sequence of steps to be performed on command. Using AR for input lets users control the arm with a reasonable degree of precision without extensive technical training. The application uses the orientation of the user’s head, together with recognized hand gestures, to support selection, translation, and rotation of virtual objects. The user controls the robotic arm by manipulating a 3D “cursor” object, which represents the desired position and orientation of the arm’s end effector. This information is translated into URScript commands and sent to the robot over a network connection, either continuously as the user moves the cursor or in recorded batches.
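The command path described above can be sketched in a few lines. The sketch below assumes the robot's secondary client interface on TCP port 30002, which accepts newline-terminated URScript such as `movel`; the helper names, pose values, and IP address are illustrative, not taken from the project's actual code.

```python
import socket

def pose_to_movel(pose, a=1.2, v=0.25):
    """Format a 6-DoF pose (x, y, z in metres; rx, ry, rz as an
    axis-angle rotation vector in radians) as a URScript movel command."""
    p = ", ".join(f"{c:.4f}" for c in pose)
    return f"movel(p[{p}], a={a}, v={v})\n"

def send_urscript(command, host, port=30002):
    """Send one URScript line to the controller over TCP."""
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(command.encode("ascii"))

# Example: translate the AR cursor's pose into a motion command.
cursor_pose = (0.40, -0.10, 0.30, 0.0, 3.1416, 0.0)  # hypothetical values
cmd = pose_to_movel(cursor_pose)
# send_urscript(cmd, "192.168.0.10")  # robot IP is site-specific
```

Streaming in real time would call `send_urscript` as the cursor moves; batch playback would accumulate the formatted commands and send them on demand.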
Finding: The application sends commands to and reads data from the robot over a local network TCP connection. Using a simple calibration process, the user can align the coordinate axes of the headset with those of the robot, allowing the user to specify exact poses in the space around them and have the arm reach those poses with precision.
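One standard way to implement the calibration step described above is to collect a few matched 3D points (the same physical locations expressed in headset and robot coordinates) and solve for the rigid transform between the two frames with the Kabsch/SVD method. The source does not state which procedure the project used, so this is a sketch of the general technique, not the project's implementation.

```python
import numpy as np

def fit_rigid_transform(headset_pts, robot_pts):
    """Least-squares rigid transform (Kabsch method): find R, t such
    that R @ headset_point + t ~= robot_point, given matched 3D points."""
    A = np.asarray(headset_pts, dtype=float)
    B = np.asarray(robot_pts, dtype=float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)          # centroids
    H = (A - ca).T @ (B - cb)                        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t
```

Once `R` and `t` are known, any cursor pose captured in headset coordinates can be mapped into the robot's frame before being formatted as a motion command, which is what lets the arm reach user-specified poses precisely.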