⌛ Expected completion time: 25-35 minutes
If you just want to run the completed project, this section can help you get up and running quickly. Here, we provide a pre-trained pose estimation model for you to use, and assume a Docker workflow. By the end of this quick demo, you will be able to perform pick & place in Unity with machine learning-based perception. To learn how to build something like this from scratch, see our full tutorial.
**Table of Contents**
- Prerequisites
- Add the Pose Estimation Model
- Set Up the ROS Side
- Set Up the Unity Side
- Put It All Together
## Prerequisites

You will first need to clone this repository.

1. Open a terminal, navigate to the folder where you want to host the repository, and run:

   ```shell
   git clone --recurse-submodules https://github.com/Unity-Technologies/Robotics-Object-Pose-Estimation.git
   ```

2. Open the completed project. In the Unity Hub, click the **Add** button, and select `Robotics-Object-Pose-Estimation/PoseEstimationDemoProject` from inside the file location where you cloned the repo.

3. Open the scene. Go to `Assets/Scenes` and double-click on `TutorialPoseEstimation`.

4. We now need to set the size of the images used. In the Game view, click on the dropdown menu in front of `Display 1`. Then, click **+** to create a new preset. Make sure `Type` is set to `Fixed Resolution`. Set `Width` to `650` and `Height` to `400`. The gif below depicts these actions.
## Add the Pose Estimation Model

In your root `Robotics-Object-Pose-Estimation` folder, you should have a `ROS` folder. Inside that folder you should have a `src` folder, and inside that one, five folders: `moveit_msgs`, `robotiq`, `ros_tcp_endpoint`, `universal_robot`, and `ur3_moveit`.
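If you want to confirm this layout from the command line, a quick sketch (run from the repository root; the folder names are the ones listed above):

```shell
# Prints OK/MISSING for each package folder expected under ROS/src.
for pkg in moveit_msgs robotiq ros_tcp_endpoint universal_robot ur3_moveit; do
  if [ -d "ROS/src/$pkg" ]; then
    echo "$pkg OK"
  else
    echo "$pkg MISSING"
  fi
done
```

If any folder is reported missing, the submodules likely did not clone; running `git submodule update --init --recursive` from the repository root should fetch them.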
1. Download the pose estimation model we have trained.

2. Go inside the `ROS/src/ur3_moveit` folder and create a folder named `models`. Copy the `UR3_single_cube_model.tar` file you just downloaded into this folder.
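The model placement can be sketched as shell commands. `DOWNLOAD_DIR` is an assumption here; point it at wherever you actually saved the archive:

```shell
# Run from the root of the cloned repository.
# DOWNLOAD_DIR is an assumed location; adjust it to your setup.
DOWNLOAD_DIR="$HOME/Downloads"

mkdir -p ROS/src/ur3_moveit/models
if [ -f "$DOWNLOAD_DIR/UR3_single_cube_model.tar" ]; then
  cp "$DOWNLOAD_DIR/UR3_single_cube_model.tar" ROS/src/ur3_moveit/models/
fi

# Should list UR3_single_cube_model.tar if the copy succeeded.
ls ROS/src/ur3_moveit/models
```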
## Set Up the ROS Side

Note: This project has been developed with Python 3 and ROS Noetic.

We have provided a Docker container to get you up and running quickly.

1. Install the Docker Engine if it is not already installed, and start the Docker daemon. To check that the daemon is running, open your Docker application; you should see something similar to the following (a green dot in the bottom-left corner, with the word "running" at the foot of the window):

2. In the terminal, ensure the current location is the root of the `Robotics-Object-Pose-Estimation` directory. Build the provided ROS Docker image as follows:

   ```shell
   docker build -t unity-robotics:pose-estimation -f docker/Dockerfile .
   ```

   When the build completes, it will print: `Successfully tagged unity-robotics:pose-estimation`.

   Note: The provided Dockerfile uses the ROS Noetic base image. Building the image will install the necessary packages, copy the provided ROS packages and submodules to the container, predownload and cache the VGG16 model, and build the catkin workspace.

3. Start the newly built Docker container:

   ```shell
   docker run -it --rm -p 10000:10000 -p 5005:5005 unity-robotics:pose-estimation /bin/bash
   ```

   This should open a bash shell at the ROS workspace root, e.g. `root@8d88ed579657:/catkin_ws#`.

   Note: If you encounter issues with Docker, check the Troubleshooting Guide for potential solutions.

4. Source your ROS workspace:

   ```shell
   source devel/setup.bash
   ```

The ROS workspace is now ready to accept commands!
## Set Up the Unity Side

1. At the top of your screen, open the ROS settings by selecting `Robotics/ROS Settings`. Fill `ROS IP Address` and `Override Unity IP` with the loopback IP address `127.0.0.1`.

2. Ensure that `ROS Port` is set to `10000` and `Unity Port` is set to `5005`.
## Put It All Together

Run the following `roslaunch` command to start roscore, set the ROS parameters, start the server endpoint, start the Mover Service and Pose Estimation nodes, and launch MoveIt.

1. In the terminal window of your ROS workspace opened above, run the provided launch file:

   ```shell
   roslaunch ur3_moveit pose_est.launch
   ```

   This launch file also loads all relevant files and starts the ROS nodes required for trajectory planning for the UR3 robot. The launch files for this project are available in the package's launch directory, i.e. `src/ur3_moveit/launch`.

   The launch will print various messages to the console, including the set parameters and the nodes launched. The final message should confirm `You can start planning now!`.

   Note: The launch file may throw errors regarding `[controller_spawner-5] process has died`. These are safe to ignore as long as the final message, `You can start planning now!`, still appears. This confirmation may take up to a minute to appear.
2. Return to Unity, and press **Play**.

   Note: If you encounter connection errors such as a `SocketException`, or don't see a completed TCP handshake between ROS and Unity in the console window, return to the Set Up the Unity Side section above to update the ROS Settings and generate the `ROSConnectionPrefab`.
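When debugging connection problems, a quick host-side check is whether anything is actually listening on the two ports the container publishes. A sketch using the common `lsof` tool (assumed to be installed; on Linux, `ss -ltn` is an alternative):

```shell
# Check the ROS (10000) and Unity (5005) ports on the host. A container
# started with `-p 10000:10000 -p 5005:5005` should show a listener on both.
for port in 10000 5005; do
  if lsof -i :"$port" >/dev/null 2>&1; then
    echo "port $port: something is listening"
  else
    echo "port $port: nothing is listening"
  fi
done
```

If nothing is listening on port 10000, the Docker container (or the server endpoint inside it) is likely not running.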
Note that the robot arm must be in its default position, i.e. standing upright, to perform Pose Estimation. This is done by simply clicking the `Reset Robot Position` button after each run.
3. Press the `Pose Estimation` button to send the image to ROS.

This will grab the current camera view, generate a `sensor_msgs/Image` message, and send a new Pose Estimation Service Request to the ROS node running `pose_estimation_service.py`. This node will run the trained model and return a Pose Estimation Service Response containing an estimated pose, which is subsequently converted and sent as a new Mover Service Request to the `mover.py` ROS node. Finally, MoveIt calculates and returns a list of trajectories to Unity, and the poses are executed to pick up and place the cube.

The target object and goal object can be moved around during runtime for different trajectory calculations, or the target can be randomized using the `Randomize Cube` button.
Note: You may encounter a `UserWarning: CUDA initialization: Found no NVIDIA driver on your system.` warning upon the first image prediction attempt. This warning can be safely ignored.
Note: If you encounter issues with the connection between Unity and ROS, check the Troubleshooting Guide for potential solutions.
You should see the following:
Congrats! You did it!
If you'd now like to follow the full tutorial to learn how to build the pick-and-place simulation from scratch, proceed to Part 1.