Draft: Vision-based target estimator (Kalman Filter) #22008
Conversation
Thanks for splitting this out, it should make it much easier and safer to bring it in. I'll review the overall structure. First for a bit of bikeshedding, instead of
CONFIG_MAVLINK_DIALECT="development"
CONFIG_MODULES_VISION_TARGET_ESTIMATOR=y
CONFIG_MODULES_LANDING_TARGET_ESTIMATOR=n
CONFIG_MODULES_ROVER_POS_CONTROL=n
This is kind of specific to your own project. You can simply keep that locally, or just change it in menuconfig before building the firmware.
Due to the stack size limitations, I've had to disable certain modules to be able to build the frame. This custom frame is here to make sure that future implementations do not break the vision-target-estimator.
Note that eventually the goal is to keep the common MAVLink dialect (once the new MAVLink messages (target_relative and target_absolute) are moved from dev to common).
void KF_orientation_moving::predictCov(float dt)
{
	/*
Not sure if the comment is really useful here.
Also, since the state is small, it might be simpler to write down the matrix multiplication directly.
	[h(0)⋅(cov(0;0)⋅h(0) + cov(0;1)⋅h(1)) + h(1)⋅(cov(0;1)⋅h(0) + cov(1;1)⋅h(1)) + r]
	*/

_innov_cov = _meas_matrix(0, 0) * (_covariance(0, 0) * _meas_matrix(0, 0) + _covariance(0,
Use symforce or write down the matrix operations to be more readable
@@ -0,0 +1,23 @@
# landing target GPS position in WGS84 coordinates.

uint64 timestamp # time since system start (microseconds)
Can't we use the SensorGps message for this?
// Transform the quaternion from the target's frame to the TARGET_OBS_FRAME, then to the yaw relative to NED
const matrix::Quatf q_target(target_relative.q_target);
const matrix::Quatf q_in_ned = q_sensor * q_target;
const float target_yaw_ned = matrix::wrap_pi(matrix::Eulerf(q_in_ned).psi());
wrap_pi shouldn't be necessary here as the Euler angles are already between -pi and pi.
namespace vision_target_estimator
{
class KF_orientation_moving : public Base_KF_orientation
Shouldn't Base_KF_orientation be a member of KF_orientation_moving instead of a parent? I don't see why this class should inherit from the orientation KF.
Improved for the position filter in 53b45be. Still work to be done in the Orientation filter
virtual void setStateVelVar(float var) = 0;

// Retrieve output of filter
virtual float getPosition() { return 0.f; };
Just some things to think about: is inheritance the right thing here? The orientation filter needs getters and setters for the position, velocity, and their variances even if it's not using them.
Improved for the position filter in 53b45be. Still work to be done in the Orientation filter
/**
 * @file KF_xyzb_v_decoupled_moving.cpp
 * @brief Filter to estimate the pose of moving targets. State: [r, vd, b, at, vt]
It's probably not obvious to everyone what r, vd, b, at and vt are.
Changed to State: [pos_rel, vel_uav, bias, acc_target, vel_target], State: [pos_rel, vel_rel, bias], State: [yaw, yaw_rate], and State: [yaw].
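For context, the renamed [yaw, yaw_rate] state corresponds to a standard constant-rate prediction model; a hedged sketch in generic KF notation (not taken verbatim from the PR):

```latex
x_k = \begin{bmatrix} \text{yaw} \\ \dot{\text{yaw}} \end{bmatrix}, \qquad
x_{k+1} = F\, x_k, \quad
F = \begin{bmatrix} 1 & \Delta t \\ 0 & 1 \end{bmatrix}, \qquad
P_{k+1} = F P_k F^\top + Q
```

The static-target case, State: [yaw], is the degenerate version with F = [1].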
Now that we don't have coupled filters, we can refactor the Base KF class to simplify the code and improve the virtual class inheritance (commit 53b45be). Main changes:
If this PR is merged, does it mean I can fly indoors (using UWB)? I.e. use UWB data to obtain local position estimation.
Hi @xdwgood, this PR is for target estimation. It does not change the Extended Kalman Filter (EKF) which estimates the local position. However, we do use the target estimation to provide an additional velocity estimate in the EKF, which can help keep a local position estimate.
This would be very useful! Any idea on timeline for merge?
Hopefully in the next couple of months @AMoldskred. I will have time to continue working on the final steps starting from August.
Objectives
TODO
- Fill the `test_ratio` in `estimator_aid_source3d_s` (from the Kalman filter `beta = _innov / _innov_cov * _innov`)
- Drop the `decoupled` naming, since there are no coupled filters anymore
- Move the new MAVLink messages from `development.xml` to `common.xml`. Then update the custom build images and remove `CONFIG_MAVLINK_DIALECT="development"`
Changelog Entry
Context
Linked to PR #20653
The PR has been flight tested and is ready to be reviewed @dagar @bresch @potaito. Note that due to stack size limitations, it was necessary to create custom build images (`px4_fmu-v5_visionTargetEst`) which do not contain every module (e.g. the previous landing target estimator is not included in the build). To reduce the stack size, the vision target estimator is only built `#if !defined(CONSTRAINED_FLASH)`. Here are more details.

How to use?
The vision target estimator (VTE) fuses the available information in a Kalman Filter to land precisely on static and moving targets while also controlling the yaw. The VTE can be enabled by setting `VTE_EN = 1`. The vision target estimator Position and Yaw control are enabled with `VTE_POS_EN = 1` and `VTE_YAW_EN = 1` respectively. To control the yaw, `PLD_YAW_EN` must be set to 1 in the precision landing module. The sensor fusion is performed in the Vehicle-carried North-East-Down frame (Vc-NED), a NED frame that moves with the UAV.

Sensors fused (controlled with `VTE_AID_MASK`): `vehicle_gps_position` (and potentially the MAVLink message TARGET_ABSOLUTE for moving targets)

Static and moving targets (controlled with `VTE_MODE`)

Set which task(s) the VTE should handle (controlled with `VTE_TASK_MASK`)

Secondary params that can have a big influence on performance:
- `VTE_POS_NIS_THRE`, `VTE_YAW_NIS_THRE`: Normalized Innovation Squared (NIS) thresholds for the position and orientation estimators respectively. The lower the thresholds, the more observations will be flagged as outliers.
- `VTE_BTOUT`: time after which the target is considered lost without any new position measurements.
- `VTE_GPS_V_NOISE`, `VTE_GPS_P_NOISE`, `VTE_EVP_NOISE`: minimum allowed observation noise. If these values are too low and the observation's variance is under-estimated, the system can oscillate.

Structure overview
- `vision_target_estimator`:
  - `vision_target_estimator_params`
  - `VisionTargetEst`: module registered in `WorkQueueManager.hpp` (`static constexpr wq_config_t vte{"wq:vte", 6000, -14};`). Runs the task(s) selected with `VTE_TASK_MASK` and subscribes to `vehicle_angular_velocity`, `vehicle_local_position`, `prec_land_status`, ...
  - `Position`:
    - `VTEPosition`: processes the observations (`bool VTEPosition::processObs*`) and fills an `estimator_aid_source3d_s` message field. Outlier detection is reported in the `test_ratio` field for each observation (e.g. `vte_aid_gps_pos_mission.test_ratio`). The threshold can be defined with the param `VTE_POS_NIS_THRE`.
    - Kalman filters `_target_estimator` (`Base_KF_decoupled *_target_estimator[nb_directions]`), which are defined in the `KF_xyzb_*` files.
    - Publishes `landing_target_pose` (for the prec-land module) and `vision_target_est_position`, which contains all the states of the position KF.
  - `Orientation`: same structure as `Position` but for the Yaw sensor fusion.
  - `python_derivation`: the Python derivation scripts (e.g. `KF_xyzb_decoupled_static.py`) as well as the generated .h files in the `generated` folder (e.g. `predictCov.h`)
  - `base_KF_decoupled.h`: base class for the `KF_xyzb*` filters
  - `documentation`
Relevant topics for log analysis:
- `vision_target_est_position`, `vision_target_est_orientation`: make sure that the states of the KF are coherent, e.g. the variance must increase when we lose one observation.
- `vte_acc_input`: make sure that the acceleration is properly down-sampled.
- `vte_aid_*`: check `innovation` (should resemble white noise), `test_ratio` (should be large in case of outliers), `observation_variance`, ...

Simulation results
Performed in Gazebo simulation using: PX4/PX4-SITL_gazebo-classic#950. A test is valid if the logs are coherent (checks in section above) and the drone lands on the target. Mainly tested different combinations of sensors, tuning params, the behavior of the KF in case of high sensor noise, and outlier observations.
Static target Position + Yaw:
Moving target Position:
Flight Tests
Static target Position + Yaw control
Representative logs:
Results: the drone consistently landed on a small table without falling (i.e. below a 20cm error).
VTE_PrecisionLanding.MP4
The figure below shows the vision observations (green), the GNSS observations (red), the KF output (blue), and the GNSS bias in the bottom plot. The output of the estimator is smoother than the vision observation. The GNSS bias makes it possible to fuse both vision and GNSS despite a GNSS mission position offset w.r.t. the target.

The figure below shows the innovations of the position target estimator for vision and GNSS observations. They resemble white noise, indicating good filter performance.

The figure below shows an example of outlier detection using the NIS threshold during a real flight. 71 seconds into the flight, the vision observation is a clear outlier (blue curve in top subplot). As expected, this is caught in the NIS check and the measurement is not fused (see two bottom plots). As a consequence, the KF's yaw estimation (green curve in top subplot) remains consistent despite the outlier measurements.

Kalman Filter Equations:
Static target

Moving target

Implementation details:
Coupled vs Decoupled.

Process GNSS obs.

Time delays

Moving targets param tuning:
- `VTE_MOVING_T_MAX` corresponds to K_MAX
- `VTE_MOVING_T_MIN` corresponds to K_MIN
- `MPC_Z_V_AUTO_DN` is used for the expected downwards velocity.