How to get the rtabmap slam working and correct the pose estimate for base_link using Robot_localization Odom #1149
Comments
Update: I just turned everything off and tested with only rtabmap's odometry (same params as above) and rtabmap's SLAM, with no EKF and no odometry from my sensor. I did the same sharp turn again, and this time the robot behaves as expected (it turns instead of sliding sideways as it did previously). This means my sensor and steering angle are being fused wrongly in the EKF. @matlabbe could you provide some insight on this based on the above params? I can say that the odometry from my sensor is good for translation. I just need to handle the case where rtabmap's VO is lost, and for that I am fusing only the steering angle in the EKF while telling the EKF not to trust the steering much. Could this be related to how I am fusing the odometry data from the sensor mounted on base_link, for which robot_state_publisher publishes the static transform, and likewise for the camera?
I tried to fuse only my sensor's translation (pose.x and pose.y) and use rtabmap's rgbd_odometry. But as soon as I make the U-turn, the robot's heading is correct this time, yet there is a huge difference between the EKF's estimated pose and rgbd_odometry: the EKF reports values around 2.4, 2.5 while rgbd_odometry reports 0.24, 0.26. I believe I am getting confused with the setup and the EKF is making things weird. Also, in rgbd_odometry, the parameter
Update: I played with my sensor output and found that the y values from my sensor do not comply with REP 103, even after negating them. For now, I pass only pose.x from my sensor data, together with absolute pose.x, pose.y and orientation.yaw from RGBD odometry, into the EKF. I performed the same U-turn and the side drift is gone; a bit more tweaking is required so that my sensor data and VO output nearly the same measurements for pose.x. If somebody is struggling with the same EKF fusion setup, just follow the steps below.
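A minimal sketch of what this kind of robot_localization fusion could look like in a ROS 2 Python launch file: pose.x fused from the sensor odometry, and absolute x, y, yaw fused from RGBD odometry. This is not the poster's actual configuration; the topic names (/optical_flow/odom, /odom) and the frequency are assumptions.

```python
# Hedged sketch of an EKF launch: fuse pose.x from the optical-flow/steering
# odometry and absolute x, y, yaw from rtabmap's rgbd_odometry.
# Topic names and rates are assumptions, not taken from this issue.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    # odomN_config order used by robot_localization:
    # [x, y, z, roll, pitch, yaw,
    #  vx, vy, vz, vroll, vpitch, vyaw,
    #  ax, ay, az]
    return LaunchDescription([
        Node(
            package='robot_localization',
            executable='ekf_node',
            name='ekf_filter_node',
            parameters=[{
                'frequency': 30.0,
                'two_d_mode': True,
                'publish_tf': True,              # EKF is the only publisher of odom->base_link
                'odom_frame': 'odom',
                'base_link_frame': 'base_link',
                'world_frame': 'odom',
                # Optical flow + steering odometry (topic name assumed)
                'odom0': '/optical_flow/odom',
                'odom0_config': [True, False, False,      # fuse pose.x only
                                 False, False, False,
                                 False, False, False,
                                 False, False, False,
                                 False, False, False],
                # rtabmap rgbd_odometry output (topic name assumed)
                'odom1': '/odom',
                'odom1_config': [True, True, False,       # absolute x, y
                                 False, False, True,      # absolute yaw
                                 False, False, False,
                                 False, False, False,
                                 False, False, False],
            }],
        ),
    ])
```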
The other way is to just use RGBD odometry and use the
I think the optical flow odometry and the steering angle are set wrongly in their respective odometry topics. Is the sensor really giving pose.x and pose.y globally, or relative to the base frame? You may want to set the delta value as a velocity in the base frame instead of a pose. For the steering angle, same thing: the steering angle is independent of the global yaw, so you would have to transform it into a yaw velocity instead of a yaw orientation.
The optical flow sensor is static with respect to base_link, and robot_state_publisher publishes the static transform between the optical flow frame and base_link as the parent frame. In the odometry message I generate, I set the frame_id to odom and the child_frame_id to base_link. I also feed the steering angle into the yaw field of the same odometry message. This message is then fed to the EKF to handle some uncertainty. @matlabbe I tried feeding just pose.x from this odometry message plus the absolute pose from RGBD odometry to the EKF. Is that wrong? I also tested with the x velocity from this sensor odometry and the absolute pose from RGBD; they seem to have different values, but the side drifting is gone. Now the values increase/decrease in the same direction.
What do you mean by pose.x and pose.y globally? The message is in the odom frame, and rtabmap's SLAM does the correction in the map frame over the /localization_pose topic. I also checked my sensor's y values: while turning right it sometimes reports positive values, which is wrong as per REP 105. So I turned off the y from my sensor and am using RGBD odometry's absolute pose info. @matlabbe
By globally, I meant in reference to the same fixed frame (e.g., odom). Independently of the orientation of the robot, the x-y values refer to the same fixed frame. For example, if the robot moves forward 1 meter (x=1, y=0), turns 90 degrees clockwise and moves another 1 meter forward, the resulting pose should be x=1, y=-1, not x=2, y=0. If your optical flow sensor gives x-y delta values relative to the previous pose, transform them into velocities and fuse the velocities instead. For the steering angle, if it says "+30 deg", it is relative to the base frame, not the odom frame, so it cannot be used "as is" in the pose of the odometry message. It could be translated into a velocity, put in the twist of the odometry message instead, and fused as vyaw. In the definition of the odometry message, you will see that the pose is expressed in the frame given by header.frame_id, while the twist is expressed in the child frame (child_frame_id).
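A minimal sketch of that suggestion, with assumed node, topic, and callback names: the optical-flow dx/dy deltas are converted to velocities and published only in the twist (which is expressed in child_frame_id), leaving the pose untouched.

```python
# Sketch: publish optical-flow displacement deltas as base-frame velocities
# (twist) instead of accumulating them into a global pose.
# Node/topic names and the on_flow_delta hook are assumptions.
import rclpy
from rclpy.node import Node
from nav_msgs.msg import Odometry


class FlowToTwistOdom(Node):
    def __init__(self):
        super().__init__('flow_to_twist_odom')
        self.pub = self.create_publisher(Odometry, 'optical_flow/odom', 10)
        self.last_stamp = None

    def on_flow_delta(self, dx, dy, stamp):
        """dx, dy: displacement since the previous sample, in the sensor/base frame [m].
        Called from the sensor driver callback (not shown); stamp is an rclpy Time."""
        if self.last_stamp is None:
            self.last_stamp = stamp
            return
        dt = (stamp - self.last_stamp).nanoseconds * 1e-9
        self.last_stamp = stamp
        if dt <= 0.0:
            return
        msg = Odometry()
        msg.header.stamp = stamp.to_msg()
        msg.header.frame_id = 'odom'         # pose (left empty) would be in this frame
        msg.child_frame_id = 'base_link'     # twist is expressed in this frame
        msg.twist.twist.linear.x = dx / dt   # forward velocity
        msg.twist.twist.linear.y = dy / dt   # lateral velocity
        self.pub.publish(msg)                # fuse vx (and vy) in the EKF, not pose x/y
```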
OK. What I am doing is taking those dx and dy and feeding the absolute pose to the EKF. For the steering angle, I compute the steering angle from the feedback messages of the linear actuator and then use:
Then I generate the odometry message:
It should be … I would assume vx is in the base frame; you cannot just do …
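The usual way to turn a steering angle into something fusable is the bicycle-model yaw rate; a small sketch (not necessarily the exact formula discussed above), where the wheelbase value is made up and vx is the forward speed in the base frame.

```python
# Sketch: convert a steering angle into a yaw rate (vyaw) that can be fused,
# assuming a simple bicycle/Ackermann kinematic model. WHEELBASE is a made-up
# value and must be replaced with the real front-to-rear axle distance.
import math

WHEELBASE = 1.2  # [m] assumed wheelbase


def steering_to_yaw_rate(steering_angle: float, vx: float) -> float:
    """Yaw rate [rad/s] from steering angle [rad] and forward speed vx [m/s]."""
    return vx * math.tan(steering_angle) / WHEELBASE


# The result belongs in odometry.twist.twist.angular.z (twist is expressed in
# child_frame_id, i.e. base_link) and is fused as vyaw, not as an absolute yaw
# in the pose.
```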
I have been going through rtabmap's documentation and deployed it on a real robot to do SLAM. I have an Intel RealSense D435 RGBD camera. My setup uses the translation pose (x, y) from an optical flow sensor and the steering angle for yaw, but the steering angle might not be accurate, so I also fuse rtabmap's rgbd_odometry with just Vyaw in the EKF. I wanted to add the steering angle to handle the case where visual odometry is lost on my Ackermann-steered robot. I got this setup as follows:
1. The optical flow sensor and the steering angle publish on a single odometry message, with pose.x and pose.y from the optical flow sensor and orientation.z from the steering angle. The sensor is mounted on base_link, so the static transform is published by robot_state_publisher.
2. rtabmap's VO is used only for the yaw information; I fuse only the velocity in yaw, and publish_tf is set to false.
3. In the robot_localization EKF, I feed pose.x, pose.y and orientation.yaw from the sensor/steering angle, plus Vyaw from visual odometry, and I verified that only the EKF publishes the odom->base_link transform.
4. I launch rtabmap SLAM subscribing to the odometry/filtered topic.
When visualizing in rviz2, going forward seems OK-ish, but as soon as I turn, the robot appears to be turning yet sometimes touches an obstacle, i.e., it gets very close to the black area marked in the 2D occupancy grid map, while in reality it is a bit farther away. Monitoring the yaw angle from the EKF, VO and steering, I can say that the EKF does not trust the steering heavily, because I set the measurement covariance for the absolute yaw in my odometry message to 10000.0. rtabmap SLAM outputs /localization_pose, which also seems to be somewhere between the VO's and the EKF's yaw.
Is this the correct way of setting up this scenario? Is rtabmap SLAM actually correcting the pose? Sometimes I see the robot jump slightly, which I thought means it is correcting the drift.
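For reference, that yaw covariance sits in the pose covariance of the nav_msgs/Odometry message, which is a 6x6 row-major matrix over (x, y, z, roll, pitch, yaw); a minimal illustrative sketch, not the poster's actual code:

```python
# Sketch: de-weight the steering-derived absolute yaw by giving it a huge
# variance. pose.covariance is row-major 6x6 over (x, y, z, roll, pitch, yaw),
# so the yaw variance is element 35.
from nav_msgs.msg import Odometry

msg = Odometry()
msg.header.frame_id = 'odom'
msg.child_frame_id = 'base_link'
msg.pose.covariance[35] = 10000.0  # yaw variance: effectively "do not trust this yaw"
```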
Issue
If the robot makes a sharp turn to the right, say nearly orthogonally (going forward and turning at full steering lock), in rviz I see it start moving sideways. If I take a full U-turn so that the robot is now facing the back area, which is not mapped, in rviz I see the robot going backwards while in reality it is going forward.
Is this normal visualization behaviour, or is something wrong here?
params for rtabmap's RGBD Odometry
`
`
params for rtabmap's SLAM
`
`
Main params for EKF
`
`
I checked that nothing publishes the odom->base_link transform except for the EKF.
Any suggestions to get this working properly would be highly appreciated. @matlabbe