This package provides a top-level launch file that brings up everything needed for real-time teleoperation of our UR3.
The interaction is based on visual input: an RGB-D camera monitors the human, and OpenPose is used to localize them. The wrist pixels are mapped to 3D coordinates expressed in a static reference frame, and these points are used as potential trajectory points for controlling the robot's end-effector position through this repository.
The preprocessing stage of the 3D coordinates is based on the online_trajectory_process package.
The motion of the robot's end effector is based on the cartesian_trajectory_tracking package.
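While the pipeline is running, the stream of 3D wrist points can be inspected from the command line. The topic name below is a placeholder, not the actual name used by this package; use `rostopic list` to find the topic published by keypoints_3d_matching.

```bash
# Find the topic that carries the 3D keypoints (name varies per setup)
rostopic list | grep -i keypoint

# Inspect the stream; /keypoints_3d is a placeholder name
rostopic echo /keypoints_3d
```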
- pipeline_launch.launch: Brings up the camera driver, an ar_marker node, OpenPose, keypoints_3d_matching and the frame_transpose node.
- online_trajectory_process.launch: Brings up the node for preprocessing the cartesian trajectory.
- ur3_trajectory_process.launch: Transforms the filtered trajectory to be executable by the UR3.
- cartesian_trajectory_tracking.launch: Brings up the node that tracks the input trajectory.
- 3D_visualization.html: Visualization of a 3D movement.
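Each of these launch files can also be brought up on its own, assuming they belong to the openpose_teleoperation package and the workspace containing it has been built and sourced (paths below are illustrative):

```bash
# Build and source the catkin workspace
cd ~/catkin_ws && catkin_make
source devel/setup.bash

# Bring up only the perception pipeline (camera driver, OpenPose, 3D matching)
roslaunch openpose_teleoperation pipeline_launch.launch
```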
To launch the whole framework:

```bash
roslaunch openpose_teleoperation reactive_framework.launch
```
Input modality
- `visual_input`: True if using visual input to produce the 3D keypoints, either from the real camera or from a rosbag. False if using already obtained 3D keypoints.
- `sim`: True if `use_sim_time` needs to be set to true.
- `live_camera`: True if the frames are generated by an RGB-D camera (False if they are generated by rosbags).

NOTE: The `sim` and `live_camera` arguments need to be set only if `visual_input` is set to true (see the example below).
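For example, a minimal invocation for producing 3D keypoints from recorded rosbag frames (argument names are taken from the list above; check the launch file for the actual defaults):

```bash
# 3D keypoints produced from rosbag frames: simulated time on, live camera off
roslaunch openpose_teleoperation reactive_framework.launch visual_input:=true sim:=true live_camera:=false
```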
Preprocessing procedure
- `case_raw`: True for launching the `raw_points_process` node
- `case_bezier`: True for launching the `piecewise_bezier_process` node
- `case_downsampling`: True for launching the `downsampling_interpolation_process` node
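For example, to run only the piecewise Bezier preprocessing (this sketch assumes the three case flags are exposed as launch arguments and are meant to be mutually exclusive):

```bash
# Enable the Bezier preprocessing node and disable the other two cases
roslaunch openpose_teleoperation reactive_framework.launch case_bezier:=true case_raw:=false case_downsampling:=false
```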
Robot
- `halt_motion`: True to enable the user to halt the robot motion by raising their left wrist above their left shoulder
- `p_control`: True for P controller (False for PD controller)
- `gazebo`: True if using Gazebo
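Putting it together, one possible invocation for testing in Gazebo with the PD controller and the halt gesture enabled (a sketch combining the arguments above; verify the defaults in the launch file):

```bash
# Simulated robot, PD control, left-wrist halt gesture active
roslaunch openpose_teleoperation reactive_framework.launch gazebo:=true p_control:=false halt_motion:=true
```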