Draft: Vision-based target estimator (Kalman Filter) #22008

Draft · wants to merge 65 commits into base: main
Conversation

@JonasPerolini (Contributor) commented Aug 25, 2023

Objectives

  • Extend the current landing target estimator to:
    • Have separated dynamics for moving targets.
    • Fuse GNSS (vel / position) and vision.
    • Estimate the full pose of the target (position + orientation)
  • Create a general Kalman Filter framework in which it is simple to include other observations e.g. uwb, irlock
  • Make it simple to use the Kalman Filter for other purposes than precision landing e.g. precision delivery
  • Be robust to temporary vision losses: use the GNSS position of the target (or the mission's landing position) and estimate the GNSS bias using vision
  • Update the precision landing algorithm to also control the orientation if enabled

TODO

  • If target is static, use info in EKF
  • Log relevant topics
  • Simplify logic to start estimators
  • Handle z direction in landing_target_pose msg (e.g. .vz_rel)
  • Log test_ratio in estimator_aid_source3d_s (from Kalman filters beta = _innov / _innov_cov * _innov)
  • Reduce flash size, overflow for fmu_v5x and fmu_v6*
  • Include orientation in precision landing. Linked to PR #20487 (Added yaw tracking to LandingTargetEstimator and precland)
  • When a GNSS observation is available and we get a secondary source of position measurement, only reset the position and bias part of the filter. Keep the same velocity estimation
  • Handle moving targets
  • Simplify estimator initialization
  • Require a first velocity estimate to start the filter (can come from local_pose)
  • Refactor start/stop/set estimator to simplify the use of the target estimator for different tasks.
  • Test in simulation
  • Flight test
  • Refactor orientation filter to follow position filter (with augmented state)
  • Change logic for fusing GPS relative velocity
  • Rename files, no need to specify decoupled anymore
  • Review code
  • Solve stack issues (remove modules in custom builds if necessary)
  • Update PX4 documentation
  • Once merged, the TARGET_RELATIVE and TARGET_ABSOLUTE MAVLINK messages must be moved from development.xml to common.xml. Then update the custom build images and remove CONFIG_MAVLINK_DIALECT="development"
  • Further flight test on moving targets
  • Place a GNSS on the target in real life. (Could be done by someone who uses the follow-me module)

Changelog Entry

  • TO BE FILLED

Context

Linked to PR #20653

The PR has been flight tested and is ready to be reviewed @dagar @bresch @potaito. Note that due to stack size limitations, it was necessary to create custom build images px4_fmu-v5_visionTargetEst which do not contain every module (e.g. the previous landing target estimator is not included in the build). To reduce the stack size, the vision target estimator only works #if !defined(CONSTRAINED_FLASH). Here are more details.

  1. How to use?
  2. Structure overview
  3. Relevant topics for log analysis
  4. Simulation test results
  5. Real flight test results
  6. Kalman Filter Equations
  7. Additional details
  8. TODOs

How to use?

The vision target estimator (VTE) fuses the available information in a Kalman Filter to land precisely on static and moving targets while also controlling the yaw. The VTE can be enabled by setting VTE_EN = 1. Position and yaw estimation are enabled with VTE_POS_EN = 1 and VTE_YAW_EN = 1, respectively. To control the yaw, PLD_YAW_EN must be set to 1 in the precision landing module. The sensor fusion is performed in the Vehicle-carried North-East-Down frame (Vc-NED), a NED frame that moves with the UAV.

Sensors fused (controlled with VTE_AID_MASK):

  • GNSS on target
    • GNSS observation sent via the MAVLINK message TARGET_ABSOLUTE
    • Only tested in Gazebo sim due to lack of hardware (requires a GNSS on the target which communicates with the drone).
  • GNSS velocity
    • Uses the drone's main GNSS unit via the topic vehicle_gps_position (and potentially the MAVLINK message TARGET_ABSOLUTE for moving targets)
    • Tested in sim and real flights.
  • Mission GNSS position
    • Only for static targets. Uses the landing position of the mission as an observation for the target. Note that the bias between the GNSS and the Vision measurement is estimated in the KF. Therefore, as long as the vision detects a target, the landing position does not have to be precise. The goal of the GNSS bias is to be able to land precisely if the vision target is lost during the descent.
    • Tested in sim and real flights.
  • Onboard vision
    • Observation sent via the MAVLINK message TARGET_RELATIVE
    • Tested in sim and real flights.

Static and moving targets (controlled with VTE_MODE):

  • Static: Constant position Kalman Filter model.
    • Extensive (200+) real flight tests fusing Vision + GNSS mission position + GNSS velocity
  • Moving: Constant acceleration Kalman Filter model.
    • Less than 50 real flights, only fusing Vision + GNSS velocity (lack of hardware).

Set which task(s) the VTE should handle (controlled with VTE_TASK_MASK)

  • For now only precision landing is supported, but the structure makes it easy to start/stop the estimators for other tasks, e.g. FollowMe, precision takeoff, precision delivery, ...

Secondary params that can have a big influence on performance:

  • VTE_POS_NIS_THRE, VTE_YAW_NIS_THRE: Normalized Innovation Squared (NIS) thresholds for the position and orientation estimators respectively. The lower the thresholds, the more observations will be flagged as outliers (see the sketch after this list).
  • VTE_BTOUT: Time after which the target is considered lost without any new position measurements.
  • VTE_GPS_V_NOISE, VTE_GPS_P_NOISE, VTE_EVP_NOISE: minimum allowed observation noise. If these values are too low and the observation's variance is under-estimated, the system can oscillate.
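
For intuition, a minimal sketch of the scalar NIS gate these thresholds control (the function and variable names here are illustrative, not the PR's exact API):

// Scalar Normalized Innovation Squared (NIS) gate, applied per decoupled
// axis; beta is what the PR logs in the test_ratio field.
static bool passesNisGate(float innov, float innov_cov, float nis_threshold)
{
    // beta = innov^T * S^-1 * innov reduces to innov^2 / innov_cov per axis
    const float beta = (innov * innov) / innov_cov;

    // Lower thresholds reject more observations as outliers
    return beta <= nis_threshold;
}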

Structure overview of vision_target_estimator:

  • vision_target_estimator_params
    • Contains all the params for the VTE
  • VisionTargetEst
    • Spawns the VTE task. @dagar @bresch note that it was set to a priority of -14 in WorkQueueManager.hpp
      • static constexpr wq_config_t vte{"wq:vte", 6000, -14};
    • Runs both the position and orientation filters at 50Hz (hard coded) depending on the task at hand (VTE_TASK_MASK)
    • Downsamples the UAV's acceleration (the input of the estimators)
    • Subscribes to relevant topics required for the estimators e.g. vehicle_angular_velocity, vehicle_local_position, prec_land_status, ...
  • Position
    • VTEPosition
      • Processes sensor observations (bool VTEPosition::processObs*)
        • Publishes a status for each observation using the estimator_aid_source3d_s message.
        • Note that the Normalized Innovation Squared test (NIS) is used to determine whether an observation should be fused or not. It is logged in the test_ratio field for each observation (e.g. vte_aid_gps_pos_mission.test_ratio). The threshold can be defined with the param: VTE_POS_NIS_THRE
      • Handles the sensor fusion of all _target_estimator (Base_KF_decoupled *_target_estimator[nb_directions]) which are defined in the KF_xyzb_* files.
      • Publishes landing_target_pose (for the prec-land module) and vision_target_est_position which contains all the states of the position KF.
  • Orientation: same structure as Position, but for the yaw sensor fusion.
  • python_derivation
    • To speed up the KF matrix computations, SymForce is used. This folder contains the python files for generating the code (e.g. KF_xyzb_decoupled_static.py) as well as the generated .h files in the generated folder (e.g. predictCov.h)
    • base_KF_decoupled.h
      • Contains the virtual functions used for both the static and moving KFs
    • KF_xyzb*
      • Contain the Kalman Filter equations
  • documentation
    • Contains a PDF deriving the Kalman Filter models and explaining the design choices.

Relevant topics for log analysis:

  • vision_target_est_position, vision_target_est_orientation: Make sure that the states of the KF are coherent, e.g. the variance must increase when an observation is lost.
  • vte_acc_input: Make sure that the acceleration is properly downsampled.
  • vte_aid_*: Check the innovations (should resemble white noise), the test_ratio (should be large for outliers), the observation_variance, ...

Simulation results

Performed in Gazebo simulation using: PX4/PX4-SITL_gazebo-classic#950. A test is valid if the logs are coherent (see the checks in the section above) and the drone lands on the target. The tests mainly covered different sensor combinations, tuning params, and the behavior of the KF under high sensor noise and outlier observations.

  • Static target Position + Yaw:

    • Vision
    • Mission GNSS position
    • Target GNSS
    • Vision + Target GNSS
    • Vision + Mission GNSS
    • All the above + GNSS velocity
  • Moving target Position:

    • Vision
    • Mission GNSS position
    • Target GNSS
    • Vision + Target GNSS
    • Vision + Mission GNSS
    • All the above + GNSS velocity

Flight Tests

Static target Position + Yaw control

RealFlight

VTE_PrecisionLanding.MP4
  • Moving target Position: Less tested!
    • Vision + GNSS velocity only

MovingTargetImg

The figure below shows the vision observations (green), the GNSS observations (red), the KF output (blue), and the GNSS bias in the bottom plot. The output of the estimator is smoother than the vision observations. The GNSS bias makes it possible to fuse both vision and GNSS despite an offset between the GNSS mission position and the target.
VTE_States

The figure below shows the innovations of the Position target estimator for vision and GNSS observations. They resemble white noise, indicating good filter performance.
ObservationInnovation

The figure below shows an example of outlier detection using the NIS threshold during a real flight. 71 seconds into the flight, the vision observation is a clear outlier (blue curve in top subplot). As expected, this is caught in the NIS check and the measurement is not fused (see two bottom plots). As a consequence, the KF's yaw estimation (green curve in top subplot) remains consistent despite the outlier measurements.
OutlierRejectionYaw

Kalman Filter Equations:

States

Static target
StaticTargetEq

Moving target
MovingTargetEq
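
In rough per-axis discrete form (a reconstruction from the model descriptions and state definitions above, not the exact equations from the linked PDF):

Static target (constant-position model), state $[p_{rel}, v_{rel}, b]$ with the UAV acceleration $a_{uav}$ as input:

$$
\begin{bmatrix} p_{rel} \\ v_{rel} \\ b \end{bmatrix}_{k+1}
=
\begin{bmatrix} 1 & \Delta t & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}
\begin{bmatrix} p_{rel} \\ v_{rel} \\ b \end{bmatrix}_{k}
+
\begin{bmatrix} -\tfrac{1}{2}\Delta t^{2} \\ -\Delta t \\ 0 \end{bmatrix} a_{uav}
$$

Moving target (constant-acceleration model), state $[p_{rel}, v_{uav}, b, a_t, v_t]$:

$$
\begin{aligned}
p_{rel,k+1} &= p_{rel,k} + (v_{t,k} - v_{uav,k})\,\Delta t + \tfrac{1}{2}(a_{t,k} - a_{uav})\,\Delta t^{2} \\
v_{uav,k+1} &= v_{uav,k} + a_{uav}\,\Delta t \\
b_{k+1} &= b_{k}, \qquad a_{t,k+1} = a_{t,k}, \qquad v_{t,k+1} = v_{t,k} + a_{t,k}\,\Delta t
\end{aligned}
$$

In both cases vision observes $p_{rel}$ directly, while GNSS-based position observations see $p_{rel} + b$, which is what makes the bias observable when both sources are available.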

Implementation details:

Coupled vs Decoupled.
CoupledVsDecoupled

Process GNSS obs.
GNSS_Obs

Time delays
TimeDelays

Moving targets param tuning

MovingTargets

  • VTE_MOVING_T_MAX corresponds to K_MAX
  • VTE_MOVING_T_MIN corresponds to K_MIN
  • MPC_Z_V_AUTO_DN is used for the expected downwards velocity.

@dagar (Member) commented Aug 26, 2023

Thanks for splitting this out, it should make it much easier and safer to bring it in. I'll review the overall structure.

First, for a bit of bikeshedding: instead of VTEST, can we try to find something more self-descriptive that doesn't sound like it's for testing?

  • vision target estimator => VTARG_EST, VTARG, VIS_TARG?

@dagar requested review from dagar and bresch on August 26, 2023
@JonasPerolini changed the base branch from main to pr-update_src/modules/mavlink/mavlink on September 4, 2023
@JonasPerolini changed the base branch from pr-update_src/modules/mavlink/mavlink to main on September 4, 2023
@JonasPerolini changed the base branch from main to pr-commander_wq on September 4, 2023
@JonasPerolini changed the base branch from pr-commander_wq to main on September 4, 2023
@JonasPerolini force-pushed the pr-vision-target-estimator branch from 48b1238 to bca51f9 on September 5, 2023
CONFIG_MAVLINK_DIALECT="development"
CONFIG_MODULES_VISION_TARGET_ESTIMATOR=y
CONFIG_MODULES_LANDING_TARGET_ESTIMATOR=n
CONFIG_MODULES_ROVER_POS_CONTROL=n
Member

This is kind of specific to your own project. You can simply keep that locally or just change it in menuconfig before building the firmware

Contributor Author

Due to the stack size limitations, I've had to disable certain modules to be able to build the firmware. This custom build is here to make sure that future implementations do not break the vision-target-estimator.

Note that eventually the goal is to keep the common Mavlink dialect (once the new Mavlink messages (target_relative and target_absolute) are moved from dev to common).


void KF_orientation_moving::predictCov(float dt)
{
/*
[h(0)⋅(cov(0;0)⋅h(0) + cov(0;1)⋅h(1)) + h(1)⋅(cov(0;1)⋅h(0) + cov(1;1)⋅h(1)) + r]
*/

Member

Not sure if the comment is really useful here. Also, since the state is small, it might be simpler to write down the matrix multiplication directly.

_innov_cov = _meas_matrix(0, 0) * (_covariance(0, 0) * _meas_matrix(0, 0) + _covariance(0,

Member

Use symforce or write down the matrix operations to be more readable.
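
For illustration, the suggested explicit matrix form could look like this with the PX4 matrix library (the function and argument names are assumptions, not the PR's actual code):

#include <matrix/math.hpp>

// Explicit H * P * H^T + R for the 2-state orientation filter; the
// arguments stand in for _meas_matrix, _covariance and the noise r.
static float innovationCovariance(const matrix::Matrix<float, 1, 2> &H,
                                  const matrix::SquareMatrix<float, 2> &P,
                                  float meas_var)
{
    // S = H P H^T + R, equivalent to the expanded expression above
    return (H * P * H.transpose())(0, 0) + meas_var;
}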

@@ -0,0 +1,23 @@
# landing target GPS position in WGS84 coordinates.

uint64 timestamp # time since system start (microseconds)
Member

Can't we use the SensorGps message for this?

// Transform the quaternion from the target's frame (expressed in TARGET_OBS_FRAME) to the yaw relative to NED
const matrix::Quatf q_target(target_relative.q_target);
const matrix::Quatf q_in_ned = q_sensor * q_target;
const float target_yaw_ned = matrix::wrap_pi(matrix::Eulerf(q_in_ned).psi());
Member

wrap_pi shouldn't be necessary here as the Euler angles are already between -pi and pi.


namespace vision_target_estimator
{
class KF_orientation_moving : public Base_KF_orientation
Member

Shouldn't Base_KF_orientation be a member of KF_orientation_moving instead of a parent? I don't see why this class should inherit from the orientation KF.

Contributor Author

Improved for the position filter in 53b45be. Still work to be done in the Orientation filter

virtual void setStateVelVar(float var) = 0;

// Retrieve output of filter
virtual float getPosition() { return 0.f; };
Member

Just some things to think about: is inheritance the right thing here? The orientation filter needs getters and setters for the position, velocity, and their variances even if it's not using them.

Contributor Author

Improved for the position filter in 53b45be. Still work to be done in the Orientation filter


/**
* @file KF_xyzb_v_decoupled_moving.cpp
* @brief Filter to estimate the pose of moving targets. State: [r, vd, b, at, vt]
Member

It's probably not obvious to everyone what r, vd, b, at and vt are.

Contributor Author

Changed to State: [pos_rel, vel_uav, bias, acc_target, vel_target], State: [pos_rel, vel_rel, bias], State: [yaw, yaw_rate], State: [yaw]

@JonasPerolini (Contributor Author)

Now that we don't have coupled filters, we can refactor the Base KF class to simplify the code and improve the virtual class inheritance (commit 53b45be).

Main changes:

  1. In VTEPosition.cpp use an AugmentedState defined as pos_rel, vel_uav, bias, acc_target, vel_target (which corresponds to the state of the moving target filter).
    • Each filter has its own state and handles the received AugmentedState accordingly:
      • Static filter: pos_rel, vel_rel = -vel_uav, bias
      • Moving filter: State = AugmentedState
    • This drastically reduces the need to discriminate between filters (if(_target_mode == TargetMode::Stationary){...}) in VTEPosition.cpp and thus simplifies the code.
    • It also clarifies the initialization of the filters and the publication of the targets. Previously, the initial velocity in VTEPosition.cpp depended on the target mode: if the target was static, vel_init = vel_rel, and if the target was moving, vel_init = uav_vel. Now we follow the augmented state in VTEPosition.cpp, therefore vel_init is always equal to vel_uav. The static filter transforms the augmented state to its own state (vel_rel = -vel_uav). Similarly, the filters' getters return the augmented state, e.g. the static filter transforms its state back to the augmented state.
    • No more unused functions in KF_decoupled_static inherited from base_KF_decoupled.h. Instead of having setPosition(pos_init), setVel(vel_init), ... we now have a single function setState(augmented_state). The static filter simply ignores the fields it does not need (see the sketch after this list).
  2. No need to do Cholesky decompositions anymore; therefore the measurement matrix can be 3x5 instead of 3x15 because the x, y, z directions are fully decoupled.
  3. Use of enums when accessing vectors to clarify the code.
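
A minimal C++ sketch of the augmented-state mapping in point 1 (all names are illustrative, not the exact PR API):

// Per-axis augmented state shared by VTEPosition (illustrative names)
struct AugmentedState {
    float pos_rel;     // relative target position
    float vel_uav;     // UAV velocity
    float bias;        // GNSS - vision bias
    float acc_target;  // target acceleration
    float vel_target;  // target velocity
};

class Base_KF {
public:
    virtual ~Base_KF() = default;
    virtual void setState(const AugmentedState &s) = 0;
    virtual AugmentedState getAugmentedState() const = 0;
};

// Static-target filter: internal state is [pos_rel, vel_rel, bias]
class KF_static : public Base_KF {
public:
    void setState(const AugmentedState &s) override
    {
        _pos_rel = s.pos_rel;
        _vel_rel = -s.vel_uav; // static target: vel_rel = -vel_uav
        _bias = s.bias;        // acc_target and vel_target are ignored
    }
    AugmentedState getAugmentedState() const override
    {
        // Map the internal state back to the augmented convention
        return AugmentedState{_pos_rel, -_vel_rel, _bias, 0.f, 0.f};
    }
private:
    float _pos_rel{0.f};
    float _vel_rel{0.f};
    float _bias{0.f};
};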

@JonasPerolini force-pushed the pr-vision-target-estimator branch from 87636a0 to 4372c37 on March 12, 2024
@xdwgood (Contributor) commented Apr 24, 2024

Create a general Kalman Filter framework in which it is simple to include other observations e.g. uwb, irlock

If this PR is merged, does it mean I can fly indoors (using UWB), i.e. use UWB data to obtain a local position estimate?

@JonasPerolini (Contributor Author)

Create a general Kalman Filter framework in which it is simple to include other observations e.g. uwb, irlock

If this PR is merged, does it mean I can fly indoors (using UWB). Use UWB data to obtain local position estimation

Hi @xdwgood, this PR is for target estimation. It does not change the Extended Kalman Filter (EKF) that estimates the local position. However, we do use the target estimate to provide an additional velocity estimate to the EKF, which can help maintain a local position estimate.

@AMoldskred

This would be very useful! Any idea on timeline for merge?

@JonasPerolini (Contributor Author)

This would be very useful! Any idea on timeline for merge?

Hopefully in the next couple of months @AMoldskred. I will have time to continue working on the final steps starting from August.

@JonasPerolini force-pushed the pr-vision-target-estimator branch from 3982f09 to 6d3e859 on September 23, 2024