
PX4 autopilot

Python autopilot using the PX4 firmware toolchain, Gazebo, ROS, MAVROS and the MAVLink protocol.

🚨 Looking for an engineer? I am currently looking for a job, take a look at my portfolio! 🙋‍♂️

Intro

This work was done by Rémy Hidra during his Master of Research at Shanghai Jiao Tong University, China.

This is built as a ROS package. You can also find a few external experiments and data collection in the analysis directory. The module features C++ and Python code. It mainly contains:

  • A global planner, using the Phi* path planning algorithm (in Python).
  • A local planner, using a hybrid motion primitives optimization method (in Python and improved in C++).
  • A main navigation controller, to transmit data to MAVROS.

If you are interested in my work, you can check out my thesis to learn more about the theory and the implementation. The installation of ROS, PX4, MAVROS and most other components is detailed in this Readme.

If you have any other questions, you can contact me by email, LinkedIn or directly on GitHub.

Installation

Setting up the toolchain

Install the PX4 toolchain, ROS and Gazebo using the PX4 scripts. Execute the ubuntu_sim_ros_melodic.sh script to install ROS Melodic and MAVROS, then execute the ubuntu.sh script to install simulators like Gazebo and the rest of the PX4 toolchain. You must use Ubuntu 18.04; the toolchain will not work on other versions!
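
A minimal sketch of those two steps, assuming the script locations used by the PX4 documentation at the time (they may have moved since):

wget https://raw.githubusercontent.com/PX4/Devguide/master/build_scripts/ubuntu_sim_ros_melodic.sh
bash ubuntu_sim_ros_melodic.sh

# From the cloned Firmware (see below)
bash ~/Firmware/Tools/setup/ubuntu.sh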

You may need to clone the Firmware. We cloned it into the home folder.
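
For example, to clone it into the home folder (the upstream repository has since been renamed PX4-Autopilot, but the old URL still redirects):

git clone --recursive https://github.com/PX4/Firmware.git ~/Firmware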

The source code is in ~/catkin_ws/src/, with some common modules like MAVROS, and the custom autopilot from this repository. Always build the packages with catkin build.

In your .bashrc, you need the following lines. Adapt $fw_path and $gz_path to the location of the PX4 firmware.

source /opt/ros/melodic/setup.bash
source ~/catkin_ws/devel/setup.bash

fw_path="$HOME/Firmware"
gz_path="$fw_path/Tools/sitl_gazebo"
source $fw_path/Tools/setup_gazebo.bash $fw_path $fw_path/build/px4_sitl_default
export ROS_PACKAGE_PATH=$ROS_PACKAGE_PATH:$fw_path
export ROS_PACKAGE_PATH=$ROS_PACKAGE_PATH:$gz_path

# Set the plugin path so Gazebo finds our model and sim
export GAZEBO_PLUGIN_PATH=${GAZEBO_PLUGIN_PATH}:$gz_path/build
# Set the model path so Gazebo finds the airframes
export GAZEBO_MODEL_PATH=${GAZEBO_MODEL_PATH}:$gz_path/models
# Disable online model lookup since this is quite experimental and unstable
export GAZEBO_MODEL_DATABASE_URI=""
export SITL_GAZEBO_PATH=$gz_path

Compile and run a vanilla simulation

First, you need to compile the PX4 toolchain once.

DONT_RUN=1 make px4_sitl_default gazebo

We use roslaunch to launch all the necessary ROS nodes.

roslaunch px4 mavros_posix_sitl.launch

This command runs the launch file located at Firmware/launch/mavros_posix_sitl.launch in the PX4 ROS package. It launches PX4 in SITL mode, MAVROS and Gazebo connected to PX4, and spawns the UAV.

It is possible to pass arguments to change the vehicle's initial pose, the world or the vehicle type.

roslaunch px4 mavros_posix_sitl.launch x:=10 y:=10 world:=$HOME/Firmware/Tools/sitl_gazebo/worlds/warehouse.world

The maps are stored in the PX4 toolchain in Firmware/Tools/sitl_gazebo/worlds/. The Gazebo models and UAVs used in the simulation are in Firmware/Tools/sitl_gazebo/models/.

Setting up the octomap

To use the octomap generated by all the sensors, you need to install the octomap_server node for ROS.

sudo apt install ros-melodic-octomap ros-melodic-octomap-mapping
rosdep install octomap_mapping
rosmake octomap_mapping
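
Once installed, you can sanity-check the server by running it on an existing .bt map; a minimal example, with the map path as a placeholder:

# Serve a pre-built octomap (replace my_map.bt with your file)
rosrun octomap_server octomap_server_node my_map.bt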

Then, in order to use the octomap data in the autopilot, you need the Python wrapper of the C++ octomap library. You can install it with pip2, but I had conflicts between Python 2 and 3, so I compiled the module directly.

git clone --recursive https://github.com/wkentaro/octomap-python.git
cd octomap-python
python2 setup.py build
python2 setup.py install
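
To check that the binding is importable from Python 2, a quick sanity check:

python2 -c "import octomap; print('octomap binding OK')"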

If you are using the ground truth octomap of the Gazebo world, you need to generate the .bt file representation of the octomap. You can generate it using the world2oct.sh script in the autopilot/world_to_octomap/ folder of this repo.

For more info, refer to autopilot/world_to_octomap/readme.md.

Trajectory tracking

To efficiently track a generated trajectory, we use mavros_controllers. It launches a geometric_controller ROS node, which reads TwistStamped messages on the reference/setpoint topic. It then communicates with the PX4 firmware to execute the tracking. We send the points of the motion primitives to this topic in real time.

To install the module, follow the instructions in the repo.
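
A hedged install sketch, assuming the upstream Jaeyoung-Lim/mavros_controllers repository and the catkin workspace described above:

git clone https://github.com/Jaeyoung-Lim/mavros_controllers.git ~/catkin_ws/src/mavros_controllers
cd ~/catkin_ws
catkin build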

SLAM

For mapping and localization, we are using OpenVSLAM (or the community version).

Instead of the official version, we are using a modified custom version for ROS.

First, you need to install OpenVSLAM (community version), using OpenCV 3.x.x and possibly PangolinViewer. You can simply follow the script on that page to compile all the dependencies. Then, you need to install the ROS package.

You may need to download a DBoW dictionary. One is provided by OpenVSLAM on Google Drive or Baidu Drive (Pass: zb6v). This will give you the orb_vocab.dbow2 file needed later.

You may also need to calibrate your camera. OpenVSLAM requires a config.yaml file describing the camera calibration. This page provides a tutorial to calibrate a camera and output a YAML and a TXT file. However, the YAML file is not in the right format for OpenVSLAM; you have to convert it by hand. You can use this and this. We provide a config file for the Bebop in ./bebop/config.yaml.

Add the bash sourcing file of OpenVSLAM in your .bashrc.

source $HOME/openvslam/ros/devel/setup.bash

The pose and odometry data are published in /openvslam/camera_pose and /openvslam/odometry, as geometry_msgs/PoseStamped and nav_msgs/Odometry.
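
To verify that the SLAM output is flowing, you can watch the pose topic:

rostopic echo /openvslam/camera_pose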

SLAM mapping with an Android phone

To create a map of the environment, you may want to map it using an Android phone. To do that, use the IP Webcam app and get the IP address of the phone. The video feed should be accessible at http://192.168.x.x:8080/video.

Then, clone the ip_camera ROS package, a small Python utility which publishes an IP camera feed to the /camera/image_raw ROS topic.

git clone https://github.com/ravich2-7183/ip_camera
cd ip_camera
python2 nodes/ip_camera.py -u http://192.168.x.x:8080/video
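
You can confirm that frames are arriving with:

rostopic hz /camera/image_raw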

You can then start the OpenVSLAM node, which analyses the raw video feed. You may need to use image_transport to republish the video to a different topic. Follow the tutorial to learn how.

rosrun openvslam run_slam -v /home/rhidra/orb_vocab/orb_vocab.dbow2 -c aist_entrance_hall_1/config.yaml
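
If the node expects the image on a different topic, a hedged republish example (the topic names here are assumptions):

rosrun image_transport republish raw in:=/camera/image_raw raw out:=/camera/image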

Bebop setup

To run the program on actual UAVs, we are using the Parrot Bebop 2. To bypass the closed-source firmware, we are using a ROS driver, bebop_autonomy.

To install the driver, follow the official tutorial.

During the installation we had this issue. To solve it, make sure you have the folder catkin_ws/devel/lib/parrot_arsdk. If not, install the parrot_arsdk ROS module, then copy the module into the catkin workspace.

sudo apt install ros-melodic-parrot-arsdk
cp -r /opt/ros/melodic/lib/parrot_arsdk ~/catkin_ws/devel/lib/

Once you have the module installed, add this line to your .bashrc file.

export LD_LIBRARY_PATH=~/catkin_ws/devel/lib/parrot_arsdk:$LD_LIBRARY_PATH
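
Once installed, you can test the driver on its own; a sketch assuming the default bebop_autonomy launch file:

roslaunch bebop_driver bebop_node.launch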

To process the Bebop video stream, we have a custom module, image_proc, which adds some brightness. You need to install a few dependencies.

sudo apt install python-cv-bridge

To connect to the Bebop, you need to connect to the WiFi access point deployed by the UAV. To still be able to access the internet while connected, you can plug in a wired internet connection, route all traffic for 192.168.42.1 (the drone's IP address) to the WiFi interface, and route the rest of the traffic to the wired interface. To do that, get your WiFi interface name with ip route list and update your routing table. You have to run this command every time you connect to the WiFi access point.

sudo ip route add 192.168.42.1/32 dev wlp3s0

Setup the autopilot

Clone this repository into catkin_ws/src/. You also have to compile the C++ local planner. You can use:

# Compile all ROS modules
catkin build

# Only compile this module
cd src/autopilot
catkin build --this

Use a Bebop UAV

To use a Bebop as the autonomous UAV, you first need to make a map of the environment. You can simply start the drone without taking off and move it around to analyse the environment. To start the mapping module, run roslaunch autopilot mapping.launch. It launches the Bebop driver, the OpenVSLAM mapping module and the image processing bridge module. To improve the image quality, you can modify the brightness and contrast of the image processing module. Once done, OpenVSLAM creates a map database, located by default at autopilot/map-db.msg.

You can now run the autopilot in fly mode, with roslaunch autopilot fly.launch. It launches the Bebop driver, the OpenVSLAM localization module, the image processing bridge module, the Rviz visualization tool and the octomap server.

Launch simulation

To launch all necessary nodes, a launch file is available in autopilot/launch/simulation.launch. It launches ROS, MAVROS, PX4, Gazebo, the octomap server using a .bt file, the MAVROS trajectory tracker and the autopilot. It also starts Rviz with the ./config.rviz configuration file, to visualize the octomap and the algorithms. The /autopilot/viz/global and /autopilot/viz/local topics are used by the autopilot to display data in Rviz. You can also specify a vehicle and a starting position.

roslaunch autopilot simulation.launch vehicle:=iris world:=test_zone

You should also provide a starting position (x, y, z) and a goal position (goal_x, goal_y, goal_z).

roslaunch autopilot simulation.launch x:=0 y:=0 goal_x:=5 goal_y:=-7.5

The autopilot launched is made of a global planner, a local planner, a trajectory sampler, and various utility nodes: a small rostopic publisher to initialize the trajectory tracker, an offset_poses node to pre-process some data for Rviz, and a logger node used for monitoring during testing.

Autopilot

Global planner

To launch the global planner, launch the simulation (previous section), then run the global planner ROS node with the command below.

The global planners are in the planning directory. They include a replanning, incremental version of Phi* and Theta*. To use the incremental replanner, run incr_global_planner.py; to use the normal version, run global_planner.py.

With the normal version, you can specify a few parameters, such as whether the ROS node should keep running in the background (-l), the algorithm to use (-p), whether a debug display should be shown (-d), or whether to record data for later analysis. You can use the --help option to learn more.

rosrun autopilot global_planner.py -p <planning_algorithm> -l

The normal global planner computes a global path only once, then continuously broadcasts a potential local goal position to use as an objective for the local planner. The local goal is broadcast as a PoseStamped on the /autopilot/local_goal topic.

It also deploys a service /autopilot/local_goal for specific requests of a local goal at a different UAV position. The message type is defined in srv/LocalGoal.srv.
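
During a run, both interfaces can be inspected with standard ROS introspection (nothing specific to this repository):

# Watch the broadcast local goal
rostopic echo /autopilot/local_goal

# Show the service type (defined in srv/LocalGoal.srv)
rosservice type /autopilot/local_goal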

To use the incremental global planner:

rosrun autopilot incr_global_planner.py

The incremental global planner computes the initial path, then passively broadcasts the local goal using the /autopilot/local_goal service. When it receives a map update on /autopilot/octomap_update, it updates the octomap and triggers an incremental global planner update. The broadcast of the local_goal is not interrupted, which allows the UAV to continue moving.

Local planner

We implemented two local planners, one in Python and one in C++. Both are equivalent and should ideally lead to the same result. The C++ module should be used for better performance. After compilation, it is reachable through the local_planner executable file. The Python version is accessible through the local_planner.py file. Both executables take no arguments.

To launch the local planner, launch the simulation and the global planner. Then run the local planner ROS node with the following command.

rosrun autopilot local_planner

The local planner listens to messages on /autopilot/trajectory/request (controller_msgs/FlatTarget), then computes a potential trajectory and sends it to /autopilot/trajectory/response (msg defined in msg/MotionPrimitive).
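
To watch the exchange while the planner runs (each command blocks, so use separate terminals):

rostopic echo /autopilot/trajectory/request
rostopic echo /autopilot/trajectory/response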

Trajectory sampler

To sample the generated trajectory and send it to the MAVROS trajectory tracker, we are using a custom node.

rosrun autopilot sampler.py

The node writes UAV poses at a high frequency to /reference/flatsetpoint and /reference/yaw (respectively controller_msgs/FlatTarget and std_msgs/Float32). Those are read by the mavros_controllers geometric_controller node.
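
To confirm the sampler is publishing at a high rate:

rostopic hz /reference/flatsetpoint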
