VIO drifts away from pointcloud with realsense d435i #1221

Open
ARK3r opened this issue Oct 16, 2024 · 11 comments

@ARK3r

ARK3r commented Oct 16, 2024

This is the first time I am using rtabmap on my robot, so I might be doing something wrong:

When doing visual-inertial odometry AND point cloud generation with a RealSense D435i using rtabmap_ros in an outdoor environment where no loop closure is expected for a very long distance, the odometry drifts away from the point cloud. The drift happens on all three axes, x, y, and z (there are gradual elevation changes along the mobile robot's path). After 2 minutes the drift is so drastic that the odometry is practically unusable as a reference for determining the closest parts of the point cloud relative to the camera (and, as a result, the robot).

This is a video of where the odometry is relative to the point cloud. The drift in the z-axis is very clear here; however, over time all axes seem to drift. (The TF frame is supposed to be roughly 30 cm off the ground, but here it looks well over 1.5 meters off the ground.)

The launch file rtabmap_realsense_bringup.launch.py is as follows:

from launch import LaunchDescription
from launch_ros.actions import Node
from launch.actions import DeclareLaunchArgument

def generate_launch_description():

    # 'from_bag' launch argument: referenced in the LaunchDescription below;
    # its declaration was missing from the posted snippet, so a minimal one is
    # assumed here
    from_bag_arg = DeclareLaunchArgument('from_bag', default_value='false')

    front_camera_node = Node(
        name='realsense_front',
        namespace='realsense_front',
        package='realsense2_camera',
        executable='realsense2_camera_node',
        parameters=[{
            'camera_name': 'realsense_front',
            'depth_module.profile': '848x480x30',
            'rgb_module.profile': '640x480x6',
            'enable_gyro': True,
            'enable_accel': True,
            'enable_color': True,
            'enable_depth': True,
            'gyro_fps': 200,
            'accel_fps': 200,
            'unite_imu_method': 2,
            'enable_infra1': True,
            'enable_infra2': True,
            'enable_sync': True,
        }],
    )
    
    rtabmap_params = [{
          'frame_id': 'realsense_front_link',
          'subscribe_stereo': True,
          'subscribe_odom_info': True,
          'wait_imu_to_init': True,
          'use_sim_time': False,
          'Rtabmap/DetectionRate': "10",
          'Odom/ResetCountdown': "10",
    }]
    
    rtabmap_remappings=[
        ('imu', '/imu/data'),
        ('left/image_rect', '/realsense_front/infra1/image_rect_raw'),
        ('left/camera_info', '/realsense_front/infra1/camera_info'),
        ('right/image_rect', '/realsense_front/infra2/image_rect_raw'),
        ('right/camera_info', '/realsense_front/infra2/camera_info'),
    ]
    
    rtabmap_odom_node = Node(
        package='rtabmap_odom',
        executable='stereo_odometry',
        output='screen',
        parameters=rtabmap_params,
        remappings=rtabmap_remappings
    )

    rtabmap_slam_node = Node(
        package='rtabmap_slam',
        executable='rtabmap',
        output='screen',
        parameters=rtabmap_params,
        remappings=rtabmap_remappings,
        arguments=['-d']
    )

    imu_madgwick_node = Node(
        package='imu_filter_madgwick', executable='imu_filter_madgwick_node', output='screen',
        parameters=[{
            'use_mag': False, 
            'world_frame':'enu', 
            'publish_tf':False
        }],
        remappings=[
            ('imu/data_raw', '/realsense_front/imu')
        ]
    )
    
    optical_frame_tf = Node(
        package='tf2_ros',
        executable='static_transform_publisher',
        output='screen',
        arguments=['0', '0', '0', '0', '0', '0', 'realsense_front_gyro_optical_frame', 'camera_imu_optical_frame']
    )
    
    # to republish the rtabmap_ros vslam odometry in the tf2 tree for rviz visualization
    odom_to_tf_node = Node(
        package='visual_odometry',
        executable='odom_to_tf.py', 
        output='both',
    )

    return LaunchDescription([
        from_bag_arg,
        front_camera_node,
        rtabmap_odom_node,
        rtabmap_slam_node,
        imu_madgwick_node,
        optical_frame_tf,
        odom_to_tf_node,
    ])

Any help moving forward with debugging this would be appreciated.

@jbodhey

jbodhey commented Oct 16, 2024

Hey, I have also been trying to do the same thing for the past few days, but I am unable to get any data in rtabmap_viz. I would appreciate it if you could tell me how you did this with rtabmap_ros. I am using Ubuntu 22.04, ROS 2 Humble, and the same D435i camera.

Thank you

@ARK3r
Author

ARK3r commented Oct 16, 2024

@jbodhey What ROS 2 distro are you using? What code are you running?

@jbodhey

jbodhey commented Oct 16, 2024

@ARK3r Ubuntu 22.04.5 LTS, ROS 2 Humble (humble-devel). I am using realsense_d435i_color.launch.py from the examples folder.

First I run "ros2 launch realsense2_camera rs_launch.py enable_gyro:=true enable_accel:=true unite_imu_method:=1 enable_sync:=true" in one terminal,

and then "ros2 launch rtabmap_examples realsense_d435i_color.launch.py" in a second terminal.

@ARK3r
Author

ARK3r commented Oct 16, 2024

@jbodhey Have you checked that all the topics specified here are being published?

Are you running into any errors when you run the script?

@ARK3r
Author

ARK3r commented Oct 16, 2024

@jbodhey Try this; if the visualization is too laggy, change the unite method to '1'.
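
(For reference, a rough sketch of what that change would look like in the camera node's parameters from the launch file above; in the realsense2_camera driver, 1 corresponds to 'copy' and 2 to 'linear_interpolation':)

    # Sketch only: camera node with the IMU uniting method switched to 'copy',
    # which is lighter to process than linear interpolation.
    front_camera_node = Node(
        package='realsense2_camera',
        executable='realsense2_camera_node',
        parameters=[{
            'enable_gyro': True,
            'enable_accel': True,
            'enable_sync': True,
            'unite_imu_method': 1,  # was 2; 0 = none, 1 = copy, 2 = linear interpolation
        }],
    )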

@matlabbe
Member

@ARK3r
Normally stereo_odometry should already publish a TF, so you would not need:

    # to republish the rtabmap_ros vslam odometry in the tf2 tree for rviz visualization
    odom_to_tf_node = Node(
        package='visual_odometry',
        executable='odom_to_tf.py', 
        output='both',
    )
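
(An aside, in case the republisher is kept on purpose: the odometry node's own TF broadcast could be turned off instead, so the two do not conflict. A rough sketch, assuming stereo_odometry exposes the usual publish_tf parameter and reusing the parameter and remapping lists from the launch file above:)

    # Sketch only: let an external node publish the odom TF by disabling the
    # built-in broadcast ('publish_tf' is assumed to default to True)
    rtabmap_odom_node = Node(
        package='rtabmap_odom',
        executable='stereo_odometry',
        output='screen',
        parameters=rtabmap_params + [{'publish_tf': False}],
        remappings=rtabmap_remappings
    )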

I am not sure I understand this:

(The TF frame is supposed to be roughly 30 cm off the ground, but here it looks well over 1.5 meters off the ground.)

In the video, the TF would show where the camera is relative to the ground. Assuming the ground is the point cloud, I don't see a problem here; it means the camera is really that high?! Do you have a database to share?

@jbodhey I would recommend using the "infra" or "stereo" launch files instead of the color one, to get smoother odometry.

@jbodhey

jbodhey commented Oct 17, 2024

@jbodhey Try this; if the visualization is too laggy, change the unite method to '1'.

Thanks for your reply. This code works; I needed to change unite_imu_method to 1. But it is not able to show the 3D map properly. I tried both the color and stereo launch files using the code you sent. Below are images of the output and the terminal warnings:

IMG_8184

IMG_8185

@matlabbe
Member

I'll need to do more tests to see if it is better to revert the default unite_imu_method back to 1 (or add an argument to change it).

For your warnings, read the "Lost Odometry" section here: https://github.com/introlab/rtabmap/wiki/Kinect-mapping#lost-odometry-red-screens. In particular, in your case, don't point the camera at a textureless ceiling.

@ARK3r
Author

ARK3r commented Oct 18, 2024

@matlabbe You're correct that rtabmap does publish the transforms. I was running into the issue because I was not setting use_sim_time correctly, and now it works fine.
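
(For anyone hitting the same thing: a rough sketch of what the fix amounts to, assuming the data is replayed from a bag with "ros2 bag play --clock", is making every node agree on simulated time, e.g. in the shared parameter list of the launch file above:)

    rtabmap_params = [{
          'frame_id': 'realsense_front_link',
          'subscribe_stereo': True,
          'subscribe_odom_info': True,
          'wait_imu_to_init': True,
          # must be True when replaying a rosbag with --clock, False for live data
          'use_sim_time': True,
    }]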

@jbodhey

jbodhey commented Oct 20, 2024

@matlabbe

I'll need to do more tests to see if it is better to revert the default unite_imu_method back to 1 (or add an argument to change it).

For your warnings, read the "Lost Odometry" section here: https://github.com/introlab/rtabmap/wiki/Kinect-mapping#lost-odometry-red-screens. In particular, in your case, don't point the camera at a textureless ceiling.

Yes, I tried the infra and stereo launch files; they work better, and I am able to get the trajectory. Can you tell me how I can find the distance travelled (odometry) by the camera? I want to check the accuracy against the real distance.

@matlabbe
Member

@jbodhey You can echo the odom_info topic and look at the distance_travelled field:

ros2 topic echo odom_info --field distance_travelled
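
(If a programmatic check is needed, here is a minimal sketch of an rclpy node that reads the same field; it assumes the odom_info topic is of type rtabmap_msgs/msg/OdomInfo and that any topic remapping is handled as in the launch file above:)

import rclpy
from rclpy.node import Node
from rtabmap_msgs.msg import OdomInfo  # message type assumed for odom_info

class DistancePrinter(Node):
    def __init__(self):
        super().__init__('distance_printer')
        # subscribe to rtabmap's odometry info and print the accumulated distance
        self.create_subscription(OdomInfo, 'odom_info', self.callback, 10)

    def callback(self, msg):
        self.get_logger().info(f'distance travelled: {msg.distance_travelled:.2f} m')

def main():
    rclpy.init()
    rclpy.spin(DistancePrinter())

if __name__ == '__main__':
    main()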
