Fourth assignment for ME 495: Embedded Systems in Robotics, Northwestern University, Fall 2020.
This package contains code for various mapping and navigation tasks on the TurtleBot3 Burger, such as SLAM with manual navigation for map generation, localization and navigation from a pre-loaded map, SLAM and path planning using a pose goal, and autonomous frontier-based exploration.
- turtlebot3
- slam_toolbox
- `start_slam.launch`: used to map the environment manually by controlling the TurtleBot using arrow keys through the `turtlebot3_teleop_key` node.
- `nav_stack.launch`: used to navigate a pre-mapped environment using path planning and pose goals.
- `slam_stack.launch`: used to simultaneously map the environment and navigate using path planning and pose goals.
- `exploration.launch`: used to conduct autonomous exploration of an environment using a frontier-based approach and SLAM.
- `exploration` node: carries out the frontier exploration algorithm.
Code can be run in Gazebo instead of on a real robot. To do this, add the option `launch_gazebo:=true` to any of the launchfiles.
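For example, to run the exploration demo in simulation (a command fragment illustrating the option above; it assumes the workspace is built and sourced):

```shell
roslaunch homework4 exploration.launch launch_gazebo:=true
```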
-
Create a new workspace and clone the demonstration code.
```shell
# Create a new workspace
mkdir -p ws/src
# Clone the demonstration code
cd ws/src
git clone https://github.com/ME495-EmbeddedSystems/homework-3-ctsaitsao.git homework4
# Return to ws root
cd ../
```
-
Build the workspace and activate it.
```shell
catkin_make install
. devel/setup.bash
```
-
Run the launchfile of choice, for example:
```shell
roslaunch homework4 start_slam.launch
```
-
SSH into the turtlebot and start its base communication.
```shell
ssh ubuntu@turtlebot.local
roslaunch turtlebot3_bringup turtlebot3_robot.launch
```
I experienced some time-synchronization issues when running the code on a real-life TurtleBot: the time on the TurtleBot and the time on my computer did not match. A couple of fixes for this are:
- When SSH'd into the TurtleBot, run:

```shell
sudo date -s @$(date -u +"%s")
```
- If the above fix doesn't work, connect the TurtleBot to the router through an Ethernet cable and run:

```shell
timedatectl set-ntp true
```

Reboot and check that `timedatectl` returns `System clock synchronized: yes`.
- If even that doesn't work, then the TurtleBot and laptop need to be connected to the same Wi-Fi network. Do this by adding the Wi-Fi information to the `/etc/netplan/10-network.yaml` file on the TurtleBot's Raspberry Pi. If pinging `turtlebot.local` doesn't work after this, connect the RasPi to a screen and keyboard and use the `ip addr` command to acquire the TurtleBot's local IP address. Then, replace `turtlebot.local` with the IP address when SSH'ing (`ssh ubuntu@[IP address]`) and in the `ROS_HOSTNAME` environment variable in the TurtleBot's `.bashrc` file.
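A netplan Wi-Fi entry looks roughly like the following. This is a hypothetical example, not the actual file from the project; the interface name (`wlan0`), SSID, and password are placeholders to replace with your own:

```yaml
network:
  version: 2
  wifis:
    wlan0:
      dhcp4: true
      access-points:
        "MyNetworkSSID":
          password: "MyPassword"
```

After editing the file, apply the configuration with `sudo netplan apply` (or reboot).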
Demonstrations of each launchfile (videos omitted here):
- `start_slam.launch`
- `nav_stack.launch`
- `slam_stack.launch`
- `exploration.launch` (real life)
- `exploration.launch` (Gazebo)
- Get map data from the `/map` topic. This data contains a row-major 1D list of map values from `slam_toolbox`, where a value of 100 corresponds to an occupied cell, a value of 0 corresponds to an unoccupied cell, and a value of -1 corresponds to an unknown cell. We are trying to find the "frontier" cells that lie in between unoccupied and unknown cells.
- Transform the 1D list of map values into a 2D `np.ndarray`.
- From `/map`, get the pose of the lower left corner of the map in terms of the `map` frame (a standard frame of the ROS navigation stack). Broadcast this pose as a transform from `map` to `lower_left_corner` using `tf2_ros`.
- Get the image gradient of the map array. I used my own gradient function that uses a Roberts cross convolutional kernel, but any gradient/edge-detection algorithm works, such as Sobel or Canny.
- Get global costmap data from the `/move_base/global_costmap/costmap` topic.
- Transform the 1D list of global costmap values into a 2D `np.ndarray`.
- From the image gradient, find the cell positions where the gradient is above some threshold (in this case 150) and the global costmap is below some threshold (in this case 50). This essentially selects frontier cells that are not near obstacles or walls.
- Pick one of the frontier cells at random.
- Use `tf2_ros` to send the transform of the position of this frontier cell (in a new `goal` frame) relative to the `lower_left_corner` frame, accounting for the meters/cell resolution.
- From the TF server, acquire the transform from `map` to `goal`.
- Send this transform's x and y positions to `move_base` and wait for the robot to complete its trajectory.
- Loop back to the first step.
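The frontier-selection steps above (reshape, Roberts cross gradient, double threshold, random pick) can be sketched in NumPy. This is not the package's actual code: the function names are hypothetical, and the remapping of unknown cells (-1) to 255 before taking the gradient is an assumption made so that free/unknown boundaries clear the 150 gradient threshold described above.

```python
import numpy as np


def roberts_gradient(img):
    """Gradient magnitude using the two 2x2 Roberts cross kernels."""
    gx = img[:-1, :-1] - img[1:, 1:]   # diagonal difference
    gy = img[1:, :-1] - img[:-1, 1:]   # anti-diagonal difference
    return np.sqrt(gx ** 2 + gy ** 2)


def pick_frontier_cell(map_data, cost_data, width, height,
                       grad_thresh=150, cost_thresh=50, rng=None):
    """Return a random (row, col) frontier cell, or None if there are none.

    map_data/cost_data are row-major 1D lists as published on /map and
    /move_base/global_costmap/costmap (100 = occupied, 0 = free,
    -1 = unknown).
    """
    grid = np.asarray(map_data, dtype=float).reshape(height, width)
    cost = np.asarray(cost_data, dtype=float).reshape(height, width)

    # Assumed preprocessing: remap unknown cells so free/unknown
    # boundaries produce a large gradient response.
    grid[grid == -1] = 255.0

    grad = roberts_gradient(grid)

    # Frontier candidates: strong gradient AND low costmap value
    # (i.e. not close to walls or obstacles).
    rows, cols = np.nonzero((grad > grad_thresh) &
                            (cost[:-1, :-1] < cost_thresh))
    if rows.size == 0:
        return None

    rng = rng or np.random.default_rng()
    i = rng.integers(rows.size)
    return rows[i], cols[i]
```

The returned cell indices would then be converted to meters using the map resolution and broadcast as the `goal` frame relative to `lower_left_corner`, as described above.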