
Fusing RGBD odometry and wheel odometry on ROS2 Humble #1174

Open
hiroshi-automind opened this issue Jun 13, 2024 · 4 comments

@hiroshi-automind
Hello!

I'm trying to do something like #398, where I want to fuse wheel odometry and RGB-D odometry to get more accurate odometry before feeding it to rtabmap.

[image: diagram of the intended odometry fusion pipeline]

However, I can't get these nodes to communicate with each other properly. Currently I am using this command to launch rtabmap (use_sim_time is true because I'm replaying a ros2 bag on a separate development machine):

ros2 launch rtabmap_launch rtabmap.launch.py \
        rtabmap_args:="--delete_db_on_start" \
        odom_topic:=odom \
        rgb_topic:=/oak/rgb/image_raw \
        depth_topic:=/oak/stereo/image_raw \
        camera_info_topic:=/oak/rgb/camera_info \
        imu_topic:=/imu \
        approx_sync:=true \
        use_sim_time:=true \
        rviz:=false \
        rtabmap_viz:=true \
        frame_id:=base_link \
        visual_odometry:=true \
        vo_frame_id:=vo \
        odom_guess_frame_id:=odom \
        odom_frame_id:=vo

and the following for robot_localization (use_sim_time = True is injected into this launch script):

ros2 launch robot_localization ekf.launch.py
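
For reference, robot_localization's ekf_node is driven by a YAML parameter file. A minimal fusion setup along these lines might look like the following sketch (the topic names /odom and /vo, and which fields to fuse, are assumptions for this setup, not a verified configuration):

```yaml
ekf_filter_node:
  ros__parameters:
    use_sim_time: true
    frequency: 30.0
    two_d_mode: true
    publish_tf: true          # ekf_node owns odom -> base_link
    map_frame: map
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom

    # Wheel odometry (assumed topic): fuse vx, vy, yaw rate.
    # The 15 booleans are x, y, z, roll, pitch, yaw,
    # vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az.
    odom0: /odom
    odom0_config: [false, false, false,
                   false, false, false,
                   true,  true,  false,
                   false, false, true,
                   false, false, false]

    # Visual odometry from rtabmap (assumed topic): fuse x, y, yaw.
    odom1: /vo
    odom1_config: [true,  true,  false,
                   false, false, true,
                   false, false, false,
                   false, false, false,
                   false, false, false]
```

With this, the fused estimate comes out on /odometry/filtered (the node's default output topic).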

This is not currently working, and the mapping results are worse than if I launched rtabmap without robot_localization. It seems like some transforms or topics are not properly configured. rqt_graph gives me

[screenshot: rqt_graph node/topic graph, 2024-06-12]

and ros2 run tf2_tools view_frames gives me

[image: TF frame tree]

It seems like rtabmap is receiving something from robot_localization, but not through the intended /odometry/filtered topic.

Given the above context, I had a few questions, and please feel free to just redirect me to another resource if it solves my problem and I just didn't find it earlier.

  1. Is my launch script correct? I tried to follow How to make rtabmap use both visual odometer and wheel odometer? #398 and other tutorials as closely as possible, but those mainly reference ROS1, and not everything seems to transfer over to ROS2 directly.
  2. Are there any resources for doing this in ROS2?
  3. Is my odom_topic correct? What does this argument do? rgb_topic, depth_topic, etc. all seem intuitive, as those are just set to the topic the data comes from. But odom_topic seems to affect the odom input topic, the rgbd_odom output topic, and the frame_id of the output transform. Using odom_topic:=/odom messed up a lot of things, while odom_topic:=odom seems fine. I don't quite understand how this argument works.
  4. If guess_frame_id is already set to wheel odometry, and rgbd_odom just publishes corrections to wheel odometry, isn't wheel odometry already fused with rgbd_odometry? Do we still need robot_localization to fuse the two? My intuition suggests that RGB-D and wheel odometry can run completely independently, each publishing its own transform (odom->base_link and vo->base_link) and its own odometry message, and THEN we fuse them. However, when I followed #398, it (as expected) just published vo->odom->base_link, and I couldn't figure out how the odometry messages were being passed between the nodes.

Thank you very much for your time!

@matlabbe
Member

Make sure the TF for wheel odometry and visual odometry is not published; robot_localization will publish it. For VO, add publish_tf_odom:=false to the rtabmap.launch.py command to disable it. For wheel odometry, how to disable it is specific to your robot.

odom_guess_frame_id could be the TF frame of robot_localization, or left unset if you want to decouple wheel odometry from visual odometry. Setting odom_guess_frame_id makes vo publish the TF correction (vo frame to wheel frame), but the published topic is still vo->base_link (like the wheel odometry topic). With publish_tf_odom:=false, only the topic is published.

If you set odom_topic, by default vo_frame_id will be set to the same value. Explicitly set vo_frame_id if you don't want odom_topic and vo_frame_id to share the same value.

odom_frame_id should be set to the TF frame of robot_localization. rtabmap won't subscribe to the odom topic published by vo, but will use the odom TF from robot_localization.
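
Putting these points together, the original command could be adjusted roughly like this (a sketch combining the flags discussed above; topic names are taken from the original post, and robot_localization is assumed to publish the odom frame):

```shell
# VO publishes an odometry topic only (publish_tf_odom:=false);
# robot_localization owns the odom -> base_link TF, and rtabmap
# reads odometry from that TF frame (odom_frame_id:=odom).
ros2 launch rtabmap_launch rtabmap.launch.py \
    rtabmap_args:="--delete_db_on_start" \
    rgb_topic:=/oak/rgb/image_raw \
    depth_topic:=/oak/stereo/image_raw \
    camera_info_topic:=/oak/rgb/camera_info \
    frame_id:=base_link \
    visual_odometry:=true \
    publish_tf_odom:=false \
    odom_topic:=vo \
    vo_frame_id:=vo \
    odom_frame_id:=odom \
    approx_sync:=true \
    use_sim_time:=true
```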

@hiroshi-automind
Author

Hi matlabbe,

Thank you very much for the response! It seems like I need to take more care with which messages pass through TF. I might not be able to stop the robot from publishing wheel odometry (I'm using a TurtleBot 4). In that case, is it possible to completely disable the use of TF in rtabmap? If not, I'll need to find a workaround.

@hiroshi-automind
Author

Hi, sorry for the onslaught of questions. Is rtabmap_ros different from rtabmap standalone? The standalone version seems to rely only on input from an RGB-D/stereo camera, but the ROS version seems to depend more on wheel odometry.

@matlabbe
Member

We rely on TF to get more accurate odometry corresponding to the stamps of the input data (image or lidar). If the odometry topic is linked to the rtabmap node, only the covariance is actually used from that topic. There is a way to ignore TF if it doesn't exist: set wait_for_transform to 0 so rtabmap doesn't wait for that TF. It will then use the stamp from the odometry topic instead of the data stamp as reference.

To avoid publishing any TF, set publish_tf to false for rtabmap's odometry nodes and the rtabmap node. To do this, you may not be able to use rtabmap.launch.py, but would have to use the nodes directly (here are some examples using the nodes directly in a custom launch file).
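
As an illustration of running a node directly, the RGB-D odometry node can be started with publish_tf disabled (a sketch; the remapped camera topics are taken from the commands in this thread, and the /vo output name is an assumption):

```shell
# RGB-D odometry as a standalone node: odometry topic only, no TF
ros2 run rtabmap_odom rgbd_odometry --ros-args \
    -p frame_id:=base_link \
    -p publish_tf:=false \
    -r rgb/image:=/oak/rgb/image_raw \
    -r depth/image:=/oak/stereo/image_raw \
    -r rgb/camera_info:=/oak/rgb/camera_info \
    -r odom:=/vo
```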

For turtlebot4, I am currently working on an example with nav2 (replacing slam_toolbox), though I still have some issues with the simulator that I don't understand before I can officially release the example. You may try this:

ros2 launch rtabmap_launch rtabmap.launch.py \
       rtabmap_viz:=true \
       subscribe_scan:=true \
       rgbd_sync:=true \
       depth_topic:=/oak/stereo/image_raw \
       camera_info_topic:=/oak/rgb/camera_info \
       rgb_topic:=/oak/rgb/image_raw \
       visual_odometry:=false  \
       approx_sync:=true \
       approx_rgbd_sync:=false \
       guess_odom_frame_id:=odom \
       icp_odometry:=true \
       odom_topic:="icp_odom" \
       namespace:="/" \
       qos:=2 \
       odom_log_level:=warn \
       rtabmap_args:="--delete_db_on_start --Reg/Strategy 1 --Reg/Force3DoF true"

If you don't have the lidar, you may do this instead:

ros2 launch rtabmap_launch rtabmap.launch.py \
       rtabmap_viz:=true \
       rgbd_sync:=true \
       depth_topic:=/oak/stereo/image_raw \
       camera_info_topic:=/oak/rgb/camera_info \
       rgb_topic:=/oak/rgb/image_raw \
       visual_odometry:=true  \
       approx_sync:=false \
       approx_rgbd_sync:=false \
       guess_odom_frame_id:=odom \
       odom_topic:="vo" \
       namespace:="/" \
       qos:=2 \
       odom_log_level:=warn \
       rtabmap_args:="--delete_db_on_start --Reg/Force3DoF true"

I am not sure about the qos value for the real robot; you can remove it to use the default.

You can use rtabmap without wheel odometry in ROS, though most robots already have wheel odometry or something equivalent that is useful. When guess_frame_id is set for rtabmap's odometry nodes (from rtabmap.launch.py, use guess_odom_frame_id) and rtabmap's odometry gets lost, it will redirect the wheel odometry output instead, so the robot always has valid odometry (though less accurate until rtabmap's odometry can work again). Wheel odometry used as the guess frame also serves as a prediction when doing feature matching or lidar registration. And I think the most useful part is that sometimes a robot's odometry cannot be easily disabled, so when using it as a guess, rtabmap will publish its odom TF in a way that doesn't break the TF tree. For example, after basic bringup of the robot (without any mapping nodes), the TF tree would look like this:

odom -> base_link -> camera_link -> camera_optical_link...

Doing the second example above, the TF tree will look like this:

map -> vo -> odom -> base_link -> camera_link -> camera_optical_link...

where vo->odom is the visual odometry correction of the wheel odometry.
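
The correction frame is plain transform composition: if wheel odometry gives T(odom->base_link) and visual odometry estimates T(vo->base_link), the published correction is T(vo->odom) = T(vo->base_link) ∘ T(odom->base_link)⁻¹, so chaining it with the untouched wheel odometry TF reproduces the VO estimate. A minimal 2D sketch in pure Python (the numbers are made up for illustration):

```python
import math

def compose(a, b):
    """Compose two 2D poses (x, y, theta): apply b in a's frame."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + math.cos(ath) * bx - math.sin(ath) * by,
            ay + math.sin(ath) * bx + math.cos(ath) * by,
            ath + bth)

def invert(p):
    """Invert a 2D pose."""
    x, y, th = p
    return (-x * math.cos(th) - y * math.sin(th),
             x * math.sin(th) - y * math.cos(th),
            -th)

# Wheel odometry: odom -> base_link (drifted estimate)
T_odom_base = (2.0, 0.0, 0.0)
# Visual odometry: vo -> base_link (assumed more accurate)
T_vo_base = (2.1, 0.1, 0.05)

# Published correction: vo -> odom = (vo -> base) o (odom -> base)^-1
T_vo_odom = compose(T_vo_base, invert(T_odom_base))

# Sanity check: vo -> odom -> base_link reproduces the VO estimate,
# without touching the robot's own odom -> base_link TF.
recovered = compose(T_vo_odom, T_odom_base)
print(recovered)  # approximately (2.1, 0.1, 0.05)
```

This is exactly why the scheme plays nicely with robots whose odometry TF cannot be disabled: the robot keeps publishing odom->base_link, and rtabmap only prepends the vo->odom correction.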

rtabmap standalone doesn't have an odometry input because it is mostly used for handheld scanning, not on a robot.
