Indoor SLAM using Vision and Laser for Micro Aerial Robots
ISVLAMAV is a project that allows low-cost MAVs to perform SLAM in real-scale environments. The system can run on the following hardware platforms:
- ErleCopter
The system uses information from the monocular camera and the IMU found on most commercial drones. It therefore requires a monocular VSLAM algorithm to be running. Any algorithm can be used by remapping its topics; we have tested:
- LSD-SLAM: https://github.com/tum-vision/lsd_slam
- ORB-SLAM: https://github.com/raulmur/ORB_SLAM
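As a sketch of the topic remapping mentioned above, LSD-SLAM could be started with its input topics pointed at the drone's camera. The camera topic names below are assumptions for illustration; substitute the topics your driver actually publishes:

```shell
# Run LSD-SLAM with its image topics remapped to the drone's camera
# (/camera/image_raw and /camera/camera_info are assumed names).
rosrun lsd_slam_core live_slam \
    image:=/camera/image_raw \
    camera_info:=/camera/camera_info
```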
The project also requires an algorithm that returns the drone's position in 3 DoF. We use:
- Hector_slam: https://github.com/tu-darmstadt-ros-pkg/hector_slam
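For the 3 DoF laser estimate, Hector SLAM can be brought up with the launch files shipped in its repository. The launch file name below is the example one from the `hector_slam_launch` package; adapt it to your Lidar's topics and frames:

```shell
# Launch Hector SLAM with its example configuration
# (tutorial.launch is assumed; use a launch file matched to your Lidar).
roslaunch hector_slam_launch tutorial.launch
```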
- ROS: we have tested the system on Ubuntu 12.04 with ROS Hydro and on Ubuntu 14.04 with ROS Indigo.
- Visual SLAM algorithm: any VSLAM method that provides the drone's 6 DoF position can be used. The system automatically recognizes LSD-SLAM and ORB-SLAM.
- Laser SLAM algorithm: an algorithm that returns the drone's position in 3 DoF.
- Eigen: the project uses Eigen for the calculations. http://eigen.tuxfamily.org/index.php?title=Main_Page
- A hardware platform that can carry a Lidar and has enough computational capacity to run the algorithms. For instance, the Erle-Copter uses a Raspberry Pi 2B.
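On the Ubuntu versions listed above, Eigen is header-only and can typically be installed from the standard package archive (package name assumed from the Ubuntu repositories):

```shell
# Install the Eigen 3 headers used by the project's calculations.
sudo apt-get install libeigen3-dev
```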
Go to your catkin workspace and run catkin_make.
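The compilation step can be written out as a minimal shell session. The workspace path is an assumption; adjust it to your own setup:

```shell
cd ~/catkin_ws            # your catkin workspace (path is an assumption)
catkin_make               # build all packages, including ekf and PID
source devel/setup.bash   # make the freshly built packages visible to rosrun
```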
- Set the desired path for your drone in pid_main_retardos.cpp.
- Launch your VSLAM algorithm.
- Launch your Laser SLAM algorithm.
- Launch the EKF node: rosrun ekf ekf_islamav
- Make your MAV take off and launch the PID controller node: rosrun PID pid_main_retardos_islamav
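The run steps above can be sketched as one session, with each long-running node in its own terminal. The VSLAM and laser SLAM commands, launch-file names, and topic names are assumptions; substitute the ones for your setup:

```shell
# Terminal 1: VSLAM (LSD-SLAM shown as an example; ORB-SLAM also works)
rosrun lsd_slam_core live_slam image:=/camera/image_raw camera_info:=/camera/camera_info

# Terminal 2: laser SLAM providing the 3 DoF position from the Lidar
roslaunch hector_slam_launch tutorial.launch

# Terminal 3: the EKF node that fuses the visual and laser estimates
rosrun ekf ekf_islamav

# Terminal 4: after take-off, the PID controller that follows the path
rosrun PID pid_main_retardos_islamav
```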