This ROS package contains the scripts corresponding to the MS1 demo. Specifically, it provides an integration scheme for robot grasping based on existing ROS modules, developed by the NTUA CSL team. The ROS packages used are listed below.
In this link, you can watch the related video, which presents preliminary simulation results of the described scheme.
The package dependencies are:
- **MoveIt**:

  ```bash
  sudo apt-get install ros-indigo-moveit
  ```

  For more details, refer to the MoveIt Installation Instructions.
- **Tiago Robot Simulation Packages**:

  Follow the instructions here in order to install the required packages for the Tiago simulation. **Important note:** once the Tiago simulation packages are installed, the following changes should be made in `sensors_rgbd.yaml`, located inside the `tiago_moveit_config/config` folder, for simulation purposes:

  ```yaml
  #sensors:
  #  - sensor_plugin: occupancy_map_monitor/PointCloudOctomapUpdater
  #    point_cloud_topic: /xtion/depth_registered/points
  #    max_range: 5.0
  #    point_subsample: 1
  #    padding_offset: 0.1
  #    padding_scale: 1.0
  #    filtered_cloud_topic: filtered_cloud

  # NTUA modified
  sensors:
    - sensor_plugin: occupancy_map_monitor/PointCloudOctomapUpdater
      point_cloud_topic: /robot_0/xtion/depth_registered/points
      max_range: 2.0
      point_subsample: 1
      padding_offset: 0.1
      padding_scale: 1.0
      filtered_cloud_topic: filtered_cloud
  ```
- **Object Recognition Kitchen**:

  Installation instructions are described on this page.
- **pykdl_utils**:

  Kinematics and geometry utilities for KDL. See the related documentation here. To install, simply clone the repository into the tiago_public_ws workspace and build it:

  ```bash
  cd ~/tiago_public_ws/src && git clone https://github.com/gt-ros-pkg/hrl-kdl
  cd .. && catkin_make
  ```
- **ms1_grasping**:

  Clone this package inside the tiago_public_ws workspace and build it:

  ```bash
  cd ~/tiago_public_ws && catkin_make
  ```
If the ORK algorithms are used for object recognition, a database containing the objects, models, and training data must be created. To properly create and configure this database, follow the instructions described in this link. Once the database is created, the desired objects should be stored in it so that ORK can recognize them. Steps:
- **Creating an Object in the DB**:

  ```bash
  rosrun object_recognition_core object_add.py -n cylinder -d "A cylinder" --commit
  ```
- **Adding a Mesh for the Object**:

  ```bash
  rosrun object_recognition_core mesh_add.py YOUR_OBJECT_ID `rospack find ms1_grasping`/data/cylinder_5_5_20/cylinder_5_5_20.stl --commit
  ```
- **Visualizing Objects (Optional)**:

  If you want to visualize the objects in the DB, click here.
- **Deleting an Object (Optional)**:

  ```bash
  rosrun object_recognition_core object_delete.py OBJECT_ID
  ```
To run and test the algorithm, follow these steps:
- **Start the Simulation Environment of the Milestone 1 Demo**:

  ```bash
  roslaunch c4r_simulation simulate_scenario.launch scenario:=ms1_demo rviz:=ms1_ntua
  ```
- **Bring Up the Object Detection Algorithm, Object Recognized Publisher, and Move To Table Nodes**:

  ```bash
  roslaunch ms1_grasping move_to_table.launch
  ```

  **Important note:** replace the default value of the `id` argument in the `move_to_table.launch` file with the one that corresponds to the object created in your CouchDB database!

  ```xml
  <launch>
    <!-- Object Id In CouchDB -->
    <arg name="id" default="4e37b85ae1a07dcb5243ee571d00841f"/>
  ```
  When the above nodes are launched, the Tiago robot moves close to the table and detects the object (cylinder). Then, the object_recognized_publisher node broadcasts the cylinder pose to `tf` and publishes its pose and point cloud on the `/object/pose` (`PointStamped`) and `/object/points` (`PointCloud2`) topics.
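  A minimal sketch of what such a `tf` broadcaster could look like; the `object_frame` child frame name and the identity orientation are assumptions, not taken from the package:

  ```python
  #!/usr/bin/env python
  import rospy
  import tf
  from geometry_msgs.msg import PointStamped

  def broadcast_object(msg):
      # Re-broadcast the detected cylinder position as a tf frame
      br.sendTransform((msg.point.x, msg.point.y, msg.point.z),
                       (0.0, 0.0, 0.0, 1.0),   # identity rotation (assumption)
                       rospy.Time.now(),
                       "object_frame",          # assumed child frame name
                       msg.header.frame_id)

  if __name__ == '__main__':
      rospy.init_node('object_pose_broadcaster_sketch')
      br = tf.TransformBroadcaster()
      rospy.Subscriber("/object/pose", PointStamped, broadcast_object)
      rospy.spin()
  ```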
- **Open the Gripper and Move It Close to the Object Using the MoveIt Algorithm**:

  ```bash
  roslaunch ms1_grasping reaching_to_grasp.launch
  ```

  When the goal is achieved, the gripper closes. A rough sketch of this reaching step is given below.
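  Conceptually, the reaching motion can be expressed with `moveit_commander` as follows; the `arm_torso` planning group and the target pose values are assumptions, not taken from the launch file:

  ```python
  #!/usr/bin/env python
  import sys
  import rospy
  import moveit_commander
  from geometry_msgs.msg import PoseStamped

  moveit_commander.roscpp_initialize(sys.argv)
  rospy.init_node('reaching_to_grasp_sketch')
  arm = moveit_commander.MoveGroupCommander('arm_torso')  # assumed planning group

  # Example pre-grasp pose in the robot base frame (placeholder values)
  target = PoseStamped()
  target.header.frame_id = 'base_footprint'
  target.pose.position.x = 0.5
  target.pose.position.z = 0.8
  target.pose.orientation.w = 1.0

  arm.set_pose_target(target)
  success = arm.go(wait=True)  # plan and execute the reaching motion
  arm.stop()
  arm.clear_pose_targets()
  ```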
- **Control the Gripper Fingers (Optional)**:

  ```bash
  roslaunch ms1_grasping tiago_gripper_control.launch
  ```
- **Interface With KTH**
  - The navigation controller (currently using `move_base`) waits until the KTH planner publishes a goal pose on the relevant topic, and it provides the status of the goal poses. The topics subscribed and published are described below; a sketch of sending a goal follows the list.
    - Subscribing: `move_base/goal` (Msg Type: `move_base_msgs/MoveBaseActionGoal`). The controller waits until KTH publishes a goal pose on this topic.
    - Publishing: `move_base/result` (Msg Type: `move_base_msgs/MoveBaseActionResult`). It provides status information about the goals sent to the controller. The status message definition can be found here. If the NTUA navigation algorithm is used, the corresponding messages will be adapted to match the `move_base` ones.
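    A sketch of how a planner could send a goal pose through the `move_base` action interface; the goal values are placeholders, and `actionlib` wraps the `move_base/goal` and `move_base/result` topics listed above:

    ```python
    #!/usr/bin/env python
    import rospy
    import actionlib
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    rospy.init_node('send_nav_goal_sketch')
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = 1.0   # placeholder goal position
    goal.target_pose.pose.orientation.w = 1.0

    client.send_goal(goal)
    client.wait_for_result()
    # The terminal state mirrors what appears on move_base/result
    rospy.loginfo("Goal finished with state %d", client.get_state())
    ```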
  - When the KTH planner sends the word message that corresponds to grasping, for example graspit, on the relevant topic, the grasping algorithm begins. The topics related to the KTH planner are:
    - Subscribing: `/action/id` (Msg Type: `StringStamped`)
    - Publishing: `/action/result` (Msg Type: `StringStamped`)
  **Algorithm Architecture Example:**
  ```python
  #!/usr/bin/env python
  import rospy
  from geometry_msgs.msg import PointStamped
  from sensor_msgs.msg import PointCloud2
  # StringStamped is a project-specific stamped string message (import path assumed)
  from ms1_grasping.msg import StringStamped

  class GraspingClass:
      def __init__(self):
          # Subscribers
          rospy.Subscriber("/object/pose", PointStamped, self.GetObjectPose)
          rospy.Subscriber("/object/points", PointCloud2, self.GetObjectPoints)
          rospy.Subscriber("/robot_0/action/id", StringStamped, self.ActionListener)
          # Publisher
          self.ResultPublisher = rospy.Publisher("/robot_0/action/result",
                                                 StringStamped, queue_size=10)
          # Initialize variables
          self.object_pose = PointStamped()
          self.object_points = PointCloud2()

      def ActionListener(self, msg):
          # Start the grasping scheme when the planner sends the "graspit" action id
          if msg.data == "graspit":
              self.Grasping_Scheme()

      def GetObjectPoints(self, msg):
          self.object_points = msg

      def GetObjectPose(self, msg):
          self.object_pose = msg

      def Grasping_Scheme(self):
          areas_pose = self.Compute_Grasping_Areas()
          result = self.Grasp_Object(areas_pose)
          # Report the grasping result back to the planner
          msg = StringStamped()
          msg.header.stamp = rospy.Time.now()
          msg.data = result
          self.ResultPublisher.publish(msg)

      def Compute_Grasping_Areas(self):
          # Compute candidate grasping areas from the object pose and point cloud
          pass

      def Grasp_Object(self, data):
          # Execute the grasp and return a result string for /action/result
          pass
  # End of class

  if __name__ == '__main__':
      try:
          rospy.init_node('grasping_scheme', anonymous=True)
          rate = rospy.Rate(10)
          grsp = GraspingClass()
          while not rospy.is_shutdown():
              rate.sleep()
      except rospy.ROSInterruptException:
          pass
  ```
  - A Leader-Follower Scheme for the object cooperative transportation architecture will be designed similarly to the grasping algorithm. The topics related to the KTH planner will be:
    - Subscribing: `/action/id` (Msg Type: `StringStamped`)
    - Publishing: `/action/result` (Msg Type: `StringStamped`)
    When the `/action/id` message is, for example, col_transp, the controller is activated. The `/action/result` message value depends on the process state; a minimal sketch of this trigger is given below.
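    A minimal sketch of the planned activation logic; the `StringStamped` import path and the `transport_started` result value are assumptions:

    ```python
    import rospy
    from ms1_grasping.msg import StringStamped  # assumed message location

    class TransportListener:
        def __init__(self):
            rospy.Subscriber("/robot_0/action/id", StringStamped, self.action_cb)
            self.result_pub = rospy.Publisher("/robot_0/action/result",
                                              StringStamped, queue_size=10)

        def action_cb(self, msg):
            if msg.data == "col_transp":
                # Activate the leader-follower controller here,
                # then report the process state back to the planner
                result = StringStamped()
                result.header.stamp = rospy.Time.now()
                result.data = "transport_started"  # placeholder state value
                self.result_pub.publish(result)
    ```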
- **Interface With FORTH**
  - The grasping algorithm uses the partial point cloud of the object (the points of the raw point cloud that belong to the object) and its pose.
    - Subscribing topics:

      ```python
      rospy.Subscriber("/object/pose", PointStamped, self.GetObjectPose)
      rospy.Subscriber("/object/points", PointCloud2, self.GetObjectPoints)
      ```
    Information about the object mesh may also be needed. The human pose may be needed in the cooperative transportation scheme, but for now it is not used. A sketch of consuming the partial point cloud is given below.
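    A short sketch of reading the partial point cloud received on `/object/points`; the callback name is illustrative:

    ```python
    import rospy
    from sensor_msgs import point_cloud2
    from sensor_msgs.msg import PointCloud2

    def object_points_cb(cloud):
        # Iterate over the object's points, skipping NaNs from the depth sensor
        for x, y, z in point_cloud2.read_points(cloud,
                                                field_names=("x", "y", "z"),
                                                skip_nans=True):
            pass  # e.g. accumulate points to compute grasping areas

    rospy.init_node('object_points_sketch')
    rospy.Subscriber("/object/points", PointCloud2, object_points_cb)
    rospy.spin()
    ```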
- **Interface With PAL**
  - A force-torque sensor mounted on the wrist should feed the cooperative transportation control scheme with the appropriate data; a minimal sketch of such a callback follows.
    - Subscribing topics:

      ```python
      rospy.Subscriber("/ft_sensor", WrenchStamped, self.GetEEWrench)
      ```
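    A minimal sketch of the wrench callback; the logging is illustrative:

    ```python
    import rospy
    from geometry_msgs.msg import WrenchStamped

    def GetEEWrench(msg):
        # Forces and torques measured by the wrist-mounted sensor
        f = msg.wrench.force
        # Feed the measured wrench into the cooperative transportation controller here
        rospy.loginfo("Measured Fz = %.2f N", f.z)

    rospy.init_node('ft_listener_sketch')
    rospy.Subscriber("/ft_sensor", WrenchStamped, GetEEWrench)
    rospy.spin()
    ```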