Robust object tracking in 3D by fusing ultra-wideband and vision
This project was a semester project I completed at the Advanced Interactive Technology Lab (ait.ethz.ch) at the Swiss Federal Institute of Technology (ETH) during my master's studies in Information Technology and Electrical Engineering.
The project report is written in LaTeX and located in the folder doc.
The presentation I gave is located in the folder presentation.
ArUco sample used to test the ArUco Python functionality.
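A minimal detection sketch of this kind of test, assuming a placeholder image path and the DICT_6X6_250 dictionary (adjust to the printed markers; the API shown is the pre-4.7 OpenCV contrib interface):

```python
import cv2
import cv2.aruco as aruco

# Placeholder image path; any picture containing an ArUco marker works.
image = cv2.imread("aruco_sample.png")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Dictionary choice is an assumption; it must match the printed markers.
dictionary = aruco.Dictionary_get(aruco.DICT_6X6_250)
parameters = aruco.DetectorParameters_create()

# Detect markers and draw them for visual inspection.
corners, ids, rejected = aruco.detectMarkers(gray, dictionary, parameters=parameters)
print("detected marker ids:", ids)
aruco.drawDetectedMarkers(image, corners, ids)
cv2.imwrite("aruco_detected.png", image)
```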
Python script for the camera calibration with OpenCV.
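A minimal calibration sketch along these lines, assuming a 9x6 checkerboard and a hypothetical calibration_images folder:

```python
import glob
import cv2
import numpy as np

# Checkerboard geometry is an assumption; adjust to the printed pattern.
pattern_size = (9, 6)      # inner corners per row and column
square_size = 0.025        # edge length in metres

# 3D points of the checkerboard corners in the board frame.
objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square_size

object_points, image_points = [], []
for fname in glob.glob("calibration_images/*.png"):   # placeholder folder
    gray = cv2.cvtColor(cv2.imread(fname), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        object_points.append(objp)
        image_points.append(corners)

# Estimate intrinsics and distortion coefficients from all detected boards.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    object_points, image_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
print("camera matrix:\n", K)
```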
MATLAB scripts to align the coordinate systems using the Kabsch algorithm.
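The scripts in the repository are MATLAB; for reference, the same alignment can be sketched in Python/NumPy as follows:

```python
import numpy as np

def kabsch(P, Q):
    """Rotation R and translation t minimising ||R @ P_i + t - Q_i|| over paired points.

    P, Q: (N, 3) arrays of corresponding points in the two coordinate systems.
    """
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_mean).T @ (Q - q_mean)          # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # correct for a possible reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t
```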
KCF tracker base implementation without ROS.
Python script for the offline evaluation of the EKF. Measures the RMSE of the UWB and EKF positions with respect to the VICON data.
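A sketch of the RMSE computation; the HDF5 dataset names used here are placeholders and may not match the recorded file layout:

```python
import h5py
import numpy as np

def rmse(estimate, reference):
    """Root-mean-square error over time-aligned (N, 3) position arrays."""
    return np.sqrt(np.mean(np.sum((estimate - reference) ** 2, axis=1)))

# Dataset names are placeholders for the recorded trajectories.
with h5py.File("recording.h5", "r") as f:
    vicon = f["vicon"][:]
    uwb = f["uwb"][:]
    ekf = f["ekf"][:]

print("UWB RMSE vs. VICON:", rmse(uwb, vicon))
print("EKF RMSE vs. VICON:", rmse(ekf, vicon))
```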
Publishes images, recorded with a camera, as ROS messages.
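A minimal rospy sketch of such a node, assuming a hypothetical camera/image_raw topic and a local camera as the image source:

```python
import cv2
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

def main():
    rospy.init_node("publish_image")
    # Topic name and camera index are assumptions.
    pub = rospy.Publisher("camera/image_raw", Image, queue_size=10)
    bridge = CvBridge()
    cap = cv2.VideoCapture(0)
    rate = rospy.Rate(30)       # publish at roughly 30 Hz

    while not rospy.is_shutdown():
        ok, frame = cap.read()
        if ok:
            msg = bridge.cv2_to_imgmsg(frame, encoding="bgr8")
            msg.header.stamp = rospy.Time.now()
            pub.publish(msg)
        rate.sleep()

if __name__ == "__main__":
    main()
```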
Publishes the information provided by the UWB system as ROS messages.
Receives ROS messages from the uwb and publish_image nodes, detects ArUco markers, and saves the locations provided by both systems into an HDF5 file.
Receives ROS messages from the VICON system and the publish_image node, detects ArUco markers, and saves the locations provided by both systems into an HDF5 file.
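One way such a recorder could pair the two message streams is with message_filters and a growable HDF5 dataset; topic names, message types, and the dataset layout below are assumptions, and the ArUco detection step is omitted:

```python
import h5py
import message_filters
import rospy
from geometry_msgs.msg import PointStamped
from sensor_msgs.msg import Image

# Growable dataset for the UWB positions; the ArUco positions detected in the
# synchronised image would be stored analogously in a second dataset.
f = h5py.File("recording.h5", "w")
uwb_ds = f.create_dataset("uwb", shape=(0, 3), maxshape=(None, 3), dtype="f8")

def callback(uwb_msg, image_msg):
    # ArUco detection on image_msg is omitted here (see the ArUco sample above).
    uwb_ds.resize(uwb_ds.shape[0] + 1, axis=0)
    uwb_ds[-1] = (uwb_msg.point.x, uwb_msg.point.y, uwb_msg.point.z)

rospy.init_node("recorder")
uwb_sub = message_filters.Subscriber("uwb/position", PointStamped)
image_sub = message_filters.Subscriber("camera/image_raw", Image)
sync = message_filters.ApproximateTimeSynchronizer([uwb_sub, image_sub],
                                                   queue_size=10, slop=0.05)
sync.registerCallback(callback)
rospy.spin()
f.close()
```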
The KCF tracker publishes the 2D pixel coordinates as a ROS message, as well as a boolean ROS message indicating whether the object is lost.
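A minimal sketch of such a tracker node, assuming hypothetical topic names and a fixed initial bounding box (the real node would initialise the tracker from a detection rather than a hard-coded box):

```python
import cv2
import rospy
from cv_bridge import CvBridge
from geometry_msgs.msg import PointStamped
from std_msgs.msg import Bool
from sensor_msgs.msg import Image

rospy.init_node("vision_tracker")
# Topic names are assumptions.
pos_pub = rospy.Publisher("vision_tracker/pixel_position", PointStamped, queue_size=10)
lost_pub = rospy.Publisher("vision_tracker/lost", Bool, queue_size=10)
bridge = CvBridge()
tracker = cv2.TrackerKCF_create()
initialized = False

def image_callback(msg):
    global initialized
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    if not initialized:
        tracker.init(frame, (100, 100, 50, 50))   # placeholder bounding box
        initialized = True
        return
    ok, box = tracker.update(frame)
    lost_pub.publish(Bool(data=not ok))
    if ok:
        out = PointStamped()
        out.header = msg.header
        out.point.x = box[0] + box[2] / 2.0   # bounding-box centre in pixels
        out.point.y = box[1] + box[3] / 2.0
        pos_pub.publish(out)

sub = rospy.Subscriber("camera/image_raw", Image, image_callback)
rospy.spin()
```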
The EKF receives ROS messages from the vision_tracker and uwb nodes and publishes the fused positions as ROS messages.
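A sketch of a constant-velocity filter that fuses 3D position measurements from two sources; the node's actual state, motion, and noise models may differ, and with linear models the EKF equations reduce to the standard Kalman filter form shown here:

```python
import numpy as np

class FusionFilter:
    """Constant-velocity filter over 3D position with per-source measurement noise."""

    def __init__(self, dt=1.0 / 30.0):
        self.x = np.zeros(6)                       # state: [x, y, z, vx, vy, vz]
        self.P = np.eye(6)
        self.F = np.eye(6)
        self.F[:3, 3:] = dt * np.eye(3)            # position integrates velocity
        self.Q = 0.01 * np.eye(6)                  # process noise (assumed)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, meas_var):
        """Fuse a 3D position z; meas_var differs between UWB and vision."""
        R = meas_var * np.eye(3)
        y = z - self.H @ self.x                    # innovation
        S = self.H @ self.P @ self.H.T + R
        K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ self.H) @ self.P

# Usage sketch: predict once per cycle, then update with whichever measurements arrived.
ekf = FusionFilter()
ekf.predict()
ekf.update(np.array([1.0, 0.5, 2.0]), meas_var=0.05)   # e.g. a UWB position
ekf.update(np.array([1.1, 0.4, 2.0]), meas_var=0.01)   # e.g. a vision position
```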
Displays the positions provided by the UWB system, the vision tracker, and the EKF on top of the camera image.
Records the required information to perform the evaluation.