
Commit

Merge pull request #110 from Tigul/robokudo-example
Robokudo example
Tigul authored Nov 17, 2023
2 parents 6e495df + a866b37 commit f1129a3
Showing 11 changed files with 333 additions and 88 deletions.
1 change: 1 addition & 0 deletions doc/source/examples.rst
@@ -34,6 +34,7 @@ Interface Examples

.. nbgallery::
notebooks/giskard
notebooks/robokudo

Object Relational Mapping
=========================
1 change: 1 addition & 0 deletions doc/source/notebooks/robokudo.ipynb
54 changes: 38 additions & 16 deletions examples/interface_examples/giskard.ipynb
@@ -2,7 +2,7 @@
"cells": [
{
"cell_type": "markdown",
"id": "fe774ecb",
"id": "6577676e",
"metadata": {},
"source": [
"# Giskard interface in PyCRAM\n",
@@ -23,14 +23,12 @@
"roslaunch giskardpy giskardpy_hsr_iai.launch\n",
"```\n",
"\n",
"**When working with Giskard and PyCRAM, Giskard should be started first so PyCRAM can initialize the respective module.**\n",
"\n",
"To see what Giskard is doing you can start RViz, there should already be a MarkerArray when starting otherwise you have to add this manually."
]
},
{
"cell_type": "markdown",
"id": "543c39ba",
"id": "a86d4f3b",
"metadata": {},
"source": [
"## How to use the Giskard interface \n",
@@ -48,7 +46,7 @@
{
"cell_type": "code",
"execution_count": 1,
"id": "266cfa6b",
"id": "50cd3e8d",
"metadata": {},
"outputs": [],
"source": [
@@ -61,7 +59,7 @@
},
{
"cell_type": "markdown",
"id": "36517d8a",
"id": "e4166109",
"metadata": {},
"source": [
"When you are working on the real robot you also need to initialize the RobotStateUpdater, this module updates the robot in the BulletWorld with the pose and joint state of the real robot. \n",
"\n",
"You might need to change to topic names to fit the topic names as published by your robot. "
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "a3eff6ba",
"metadata": {},
"outputs": [],
"source": [
"from pycram.ros.robot_state_updater import RobotStateUpdater\n",
"\n",
"r = RobotStateUpdater(\"/tf\", \"/joint_states\")"
]
},
{
"cell_type": "markdown",
"id": "f4c4f85d",
"metadata": {},
"source": [
"Now we have a PyCRAM belief state set up, belief state in this case just refeers to the BulletWorld since the BulletWorld represents what we belief the world to look like. \n",
@@ -74,7 +94,7 @@
{
"cell_type": "code",
"execution_count": null,
"id": "f1a16dd0",
"id": "79fb8a5a",
"metadata": {
"scrolled": true
},
@@ -87,7 +107,7 @@
},
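The setup cell above is collapsed in this diff. As a minimal sketch of what such a belief-state setup could look like (the `BulletWorld`/`Object` API and the `ObjectType.ROBOT` member are assumptions based on imports used elsewhere in this PR, not taken from the collapsed cell):

```
# Minimal belief-state sketch; class and enum names are assumptions based on
# imports used elsewhere in this PR, not taken from the collapsed cell.
from pycram.bullet_world import BulletWorld, Object
from pycram.enums import ObjectType

world = BulletWorld()                                  # the simulated belief state
robot = Object("hsrb", ObjectType.ROBOT, "hsrb.urdf")  # robot model from the resources folder
```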
{
"cell_type": "markdown",
"id": "f502a41d",
"id": "77dbded9",
"metadata": {},
"source": [
"For Giskard everything is connected by joints (this is called [World Tree](https://github.com/SemRoCo/giskardpy/wiki/World-Tree) by Giskard) therefore we can move the robot by using a motion goal between the map origin and the robot base. \n",
@@ -100,7 +120,7 @@
{
"cell_type": "code",
"execution_count": null,
"id": "e64876a1",
"id": "ec79b6b5",
"metadata": {},
"outputs": [],
"source": [
@@ -112,7 +132,7 @@
},
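The corresponding code cell is collapsed here. A hedged sketch of such a base motion goal, assuming an `achieve_cartesian_goal(goal_pose, tip_link, root_link)` helper in `pycram.external_interfaces.giskard` (helper name and signature are assumptions, not confirmed by this diff):

```
# Hedged sketch: drive the base by sending a cartesian goal between the map
# origin ("map") and the robot base ("base_link"); the helper is an assumption.
from pycram.external_interfaces import giskard
from pycram.pose import Pose

giskard.achieve_cartesian_goal(Pose([1.0, 0.0, 0.0]), "base_link", "map")
```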
{
"cell_type": "markdown",
"id": "7df81875",
"id": "98af5723",
"metadata": {},
"source": [
"Now for the last example: we will move the gripper using full body motion controll. \n",
@@ -123,7 +143,7 @@
{
"cell_type": "code",
"execution_count": null,
"id": "b8c0900f",
"id": "a255212e",
"metadata": {},
"outputs": [],
"source": [
@@ -135,7 +155,7 @@
},
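This cell is collapsed as well. A sketch under the same assumptions, targeting the `gripper_tool_frame` link that the `hsrb.urdf` changed in this pull request defines:

```
# Hedged sketch: full body motion control by setting a cartesian goal for the
# gripper tool frame; "gripper_tool_frame" is taken from the hsrb.urdf in this PR.
from pycram.external_interfaces import giskard
from pycram.pose import Pose

giskard.achieve_cartesian_goal(Pose([0.5, 0.5, 0.7]), "gripper_tool_frame", "map")
```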
{
"cell_type": "markdown",
"id": "55aeb1e3",
"id": "7dfe78ba",
"metadata": {},
"source": [
"That conludes this example you can now close the BulletWorld by using the \"exit\" method."
@@ -144,7 +164,7 @@
{
"cell_type": "code",
"execution_count": 7,
"id": "1799c70d",
"id": "197aa1f0",
"metadata": {},
"outputs": [],
"source": [
@@ -153,7 +173,7 @@
},
{
"cell_type": "markdown",
"id": "6391f3b4",
"id": "2ae027ac",
"metadata": {},
"source": [
"## How the Giskard interface works \n",
@@ -165,12 +185,14 @@
" * avoid_all_collisions\n",
" * allow_self_collision\n",
" * allow_gripper_collision\n",
"The collision mode can be set by calling the respective method, after calling the method the collision mode is valid for the next motion goal afterwards it default back to avoid_all_collisions."
"The collision mode can be set by calling the respective method, after calling the method the collision mode is valid for the next motion goal afterwards it default back to avoid_all_collisions.\n",
"\n",
"There is a ```init_giskard_interface``` method which can be used as a decorator. This decorator should be used on all methods that access the giskard_wrapper, since it assures that the interface is working and checks if Giskard died or the imports for the giskard_msgs failed. "
]
},
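As a hedged illustration of this pattern (the collision-mode method names come from the list above; the argument and the motion-goal helper are assumptions):

```
# Hedged sketch: a collision mode is only valid for the next motion goal.
from pycram.external_interfaces import giskard
from pycram.pose import Pose

giskard.allow_gripper_collision("gripper")  # the argument here is an assumption
giskard.achieve_cartesian_goal(Pose([0.5, 0.0, 0.7]), "gripper_tool_frame", "map")
# any further motion goal defaults back to avoid_all_collisions
```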
{
"cell_type": "markdown",
"id": "385306d9",
"id": "6908a9ab",
"metadata": {},
"source": [
"## Extend the Giskard interface\n",
114 changes: 114 additions & 0 deletions examples/interface_examples/robokudo.ipynb
@@ -0,0 +1,114 @@
{
"cells": [
{
"cell_type": "markdown",
"id": "978aa367",
"metadata": {},
"source": [
"# Robokudo interface in PyCRAM\n",
"This notebook should give you an example on how the robokudo interface in PyCRAM works. We will go over how to use the interface, how it is implemented and what can be extended. \n",
"\n",
"First, you need to install RoboKudo by following the installation instructions [here](https://robokudo.ai.uni-bremen.de/installation.html). \n",
"\n",
"RoboKudo depends on a pipline of so-called annotators to process images, depending on your use-case the used annotators will change. But for this simple example we can use the demo pipeline from the [tutorial](https://robokudo.ai.uni-bremen.de/tutorials/run_pipeline.html). You can start RoboKudo by calling \n",
"```\n",
"rosrun robokudo main.py _ae=query\n",
"```\n",
"To get a stream of images to process you need the test bag file, from [here](https://robokudo.ai.uni-bremen.de/_downloads/6cd3bff02fd0d7a3933348060faa42fc/test.bag). You can run this bag file with the following command in the directory where the bag file is. \n",
"```\n",
"rosbag play test.bag --loop\n",
"```\n",
"\n",
"There should now be two windows which show you the result of the annotators. You switch between different annotators by using the arrow keys. \n"
]
},
{
"cell_type": "markdown",
"id": "6c37a831",
"metadata": {},
"source": [
"## How to use the RoboKudo interface in PyCRAM\n",
"Everything related to the RoboKudo interface can be found in the file ```pycram.external_interfaces.robokudo```. The most important method of this file is ```query``` which takes a PyCRAM object designator and calls RoboKudo to try to find a fitting object in the camera view. The other methods are just helper for constructing messages. \n",
"\n",
"Since we are only working with the demo pipeline we will only see how the interface functions but not actually perceive objects in the images."
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "74811bdf",
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"[WARN] [1700146040.931888]: RoboKudo is not running, could not initialize RoboKudo interface\n"
]
}
],
"source": [
"from pycram.external_interfaces import robokudo\n",
"from pycram.designators.object_designator import *\n",
"from pycram.enums import ObjectType\n",
"\n",
"object_desig_desc = ObjectDesignatorDescription(types=[ObjectType.BOWL])\n",
"robokudo.query(object_desig_desc)"
]
},
{
"cell_type": "markdown",
"id": "da0dbc43",
"metadata": {},
"source": [
"There was no object detected since the pipline we are using for this example only returns an empty message. However this should give you an impression on how the interface works."
]
},
{
"cell_type": "markdown",
"id": "15a078c8",
"metadata": {},
"source": [
"## How the RoboKudo interface in PyCRAM works\n",
"The interface to RoboKudo is designed around the ROS service that RoboKudo proviedes. The interface takes a ObjectDesignatorDescription which is PyCRAMs symbolic representation of objects and converts it to a RoboKudo ObjectDesignator, the RoboKudo ObjectDesignator is then send to RoboKudo. \n",
"\n",
"The result from this is a list of RoboKudo ObjectDesignators which are possbile matches that were found in the camera FOV. Each of these ObjectDesignators has a list of possible poses that are the result of different pose estimators (currently PyCRAM picks the pose from 'ClusterPoseBBAnnotator' from the list of possible poses).\n",
"PyCRAM then transforms all possible poses for the found Objects to 'map' frame and returns them as a dictionary.\n",
"\n",
"When using the interface the decorator ```init_robokudo_interface``` should be added to all methods that want to send queries to RoboKudo. This decorator makes sure that RoboKudo is running and creates an action client which can be used via the gloabl variable ```robokudo_action_client```."
]
},
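A hedged sketch of this decorator pattern; the helper below is hypothetical and only shows where `init_robokudo_interface` fits:

```
# Hypothetical helper, not part of PyCRAM; it only illustrates the decorator.
from pycram.external_interfaces.robokudo import init_robokudo_interface

@init_robokudo_interface
def my_query(object_designator_description):
    # at this point the decorator has checked that RoboKudo is running and
    # that the global robokudo_action_client is available for sending goals
    ...
```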
{
"cell_type": "markdown",
"id": "74f8c0d2",
"metadata": {},
"source": [
"## How to extend the RoboKudo interface in PyCRAM\n",
"At the moment the RoboKudo interface is tailored toward a specific scenarion in which only two types of objects need to be detected. The distiction is mainly made by the difference in color, which is written to the RoboKudo ObjectDesignator depending on the ObjectType of the PyCRAM ObjectDesignator. \n",
"\n",
"The main point for extension would be to make the interface more universal and extend it to work with other pipelines for example for human detection."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.10"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
6 changes: 6 additions & 0 deletions launch/ik_and_description.launch
@@ -33,6 +33,12 @@
textfile="$(find pycram)/resources/donbot.urdf"/>
</group>

<!-- HSR -->
<group if="$(eval robot == 'hsrb')">
<param name="robot_description"
textfile="$(find pycram)/resources/hsrb.urdf"/>
</group>

<!-- Use Knowrob -->
<group if="$(eval use_knowrob)">
<include file="$(find knowrob)/launch/knowrob.launch"/>
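With the new HSR group in place, the description can be loaded through the `robot` argument that the launch file already evaluates, for example:

```
roslaunch pycram ik_and_description.launch robot:=hsrb
```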
2 changes: 1 addition & 1 deletion resources/hsr.urdf → resources/hsrb.urdf
@@ -1125,7 +1125,7 @@ POSSIBILITY OF SUCH DAMAGE.
</inertial>
</link>
<joint name="gripper_tool_joint" type="fixed">
<origin rpy="0.0 0.0 1.5707963267948963d0" xyz="0.0 0.0 0.23090002"/>
<origin rpy="0.0 0.0 1.5707963267948963" xyz="0.0 0.0 0.23090002"/>
<parent link="wrist_roll_link" />
<child link="gripper_tool_frame"/>
</joint>
5 changes: 3 additions & 2 deletions src/pycram/__init__.py
@@ -23,8 +23,9 @@
import logging
import logging.config

with utils.suppress_stdout_stderr():
import pycram.process_modules
# with utils.suppress_stdout_stderr():
# import pycram.process_modules
import pycram.process_modules

logging.basicConfig(level=logging.WARNING, format='%(levelname)s - %(name)s - Line:%(lineno)d - %(message)s')

