TODO: Logo
ISAACS is an undergraduate research group within the Center for Augmented Cognition of the VHL Vive Center for Enhanced Reality at the University of California, Berkeley. Our research is in human-UAV interaction, with a focus on teleoperation, telesensing, multi-agent interaction, and the intuitive visualization of localization data. We are also part of the student group Extended Reality at Berkeley, and have recently begun a collaboration with the Lawrence Berkeley National Laboratory to perform 3D reconstruction of the environment via state-of-the-art methods in radiation detection. This repository contains the system interface. If you are looking for the RadViz system, please visit this page instead.
Led by a small team of passionate students, the project began in 2015 thanks to a Microsoft HoloLens Academic Research Grant. The prototype display, aimed at the Bitcraze Crazyflie 1.0, was a success, and eventually evolved into a system that allowed the manipulation of two such UAVs. A year later, the display was ported over to the DJI Matrice 100, and is now available in its second working version for the Matrice 210, and soon for the Matrice 600. Our decision to use the Matrice 210 and Matrice 600 quadrotors stems from their ability to support a greater range of sensors and mission-critical tasks than the previously tested UAVs could.
The current interface enables direct manipulation of the Matrice 210 using natural hand motion, and provides accurate localization and visualization of the UAV's position and environment (including nearby buildings), using GPS. Over the next few months, we will be integrating Radiation, Depth Camera and LIDAR sensors to support, among other things, 3D reconstruction and real-time mapping of the UAV's environment.
TODO: Video and/or Picture
- Hardware Dependencies and Setup
- Software Dependencies and Setup
- Installation and Deployment
- Usage
- Understanding the System
- Meet the Team
- Acknowledgments
- Licensing
You will need the following to run the system in simulation:
- DJI Matrice 210 Quadrotor
- DJI Matrice 210 RTK
- 2x Matrice 210 Quadrotor Batteries
- 1x Matrice 210 RTK Battery
- DJI Matrice Manifold 2 Onboard Computer
- 1x USB 3.0 to TTL Cable
- 1x USB 3.0 to USB 3.0 Cable
- Oculus Rift Virtual Reality Headset
- VR-Ready Computer (we suggest a GeForce GTX 970 Graphics Card or better)
- An Ethernet Cable
Additionally, to fly the UAV in real space, you will need:
- DJI Matrice RTK GPS Station
- 1x USB 3.0 Wi-Fi Card
- 1x Matrice 210 Quadrotor Battery
- A Wi-Fi Source
You will have to connect the Manifold USB 3.0 port to the Matrice 210 UART port, using the USB 3.0 to TTL Cable. Refer to this page for more information. Unlike what is described in the DJI documentation, we found that on our Matrice 210 the TX and RX pins were inverted, meaning that TX is the white pin, RX is the green pin, and GND is the black pin. You also want to make sure that the gray Power slider is slid all the way to the left.
You will also need to plug a USB 3.0 Wi-Fi card (if you plan on flying the UAV in real space) or an Ethernet cable (if you only plan on running the system in simulation) into the Manifold. To facilitate the next steps, you may also want to connect the Manifold to a keyboard and a screen, using an HDMI cable. If not, you can always SSH into it, as sketched below.
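A minimal SSH example, assuming a hypothetical Manifold address of 192.168.1.50 and username dji (substitute your own values):
```
$ ssh dji@192.168.1.50
```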
Once you have done the above, place two batteries in the UAV and plug in the Manifold power cord. Then, double-press and hold the white power button on the front of the Matrice 210 UAV, and finally press and hold the PWR button on the Manifold. If everything went well, the UAV will play a sound and the Manifold computer will boot.
The system uses two computers: one attached to the UAV, which we call the Manifold, and one running the VR interface, which we call the VR-Ready Computer. You may also use a third computer to run a flight simulation with the DJI Assistant 2 for Matrice, but this can also be done on the VR-Ready Computer while the frontend application is running. The Manifold backend depends on ROS Kinetic, which requires Ubuntu 16.04 (Xenial) or another Debian-based GNU/Linux distribution. You will furthermore need the ROS DJI SDK and a Rosbridge Server. The frontend interface depends on Unity 2018.4, and can be run on any platform, but has only been tested on Windows 10.
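To quickly confirm the OS and ROS versions on the Manifold, the standard commands below should report the values this setup expects:
```
$ lsb_release -sc    # expected output: xenial
$ rosversion -d      # expected output: kinetic
```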
Although the Manifold comes with most of what you need installed by default, you will have to set up a ROS workspace and the Rosbridge Server. Refer to this page for more information on how to set up a ROS workspace; a minimal sketch is given below.
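The sketch below assumes the `$HOME/DJI/catkin_ws` path used elsewhere in this README and the standard `dji-sdk/Onboard-SDK-ROS` repository; adjust paths and any SDK prerequisites to your own setup:
```
$ mkdir -p $HOME/DJI/catkin_ws/src                           # create the workspace and its src folder
$ cd $HOME/DJI/catkin_ws/src
$ git clone https://github.com/dji-sdk/Onboard-SDK-ROS.git   # ROS DJI SDK sources
$ cd $HOME/DJI/catkin_ws
$ catkin_make                                                # build the workspace
$ source devel/setup.bash                                    # overlay it onto your environment
```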
`catkin_make` does not compile
You might need to clone the `nmea_msgs` package into the `src` folder, and then try again; see the example below.
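For example, assuming the upstream `ros-drivers/nmea_msgs` repository and the workspace path from above:
```
$ cd $HOME/DJI/catkin_ws/src
$ git clone https://github.com/ros-drivers/nmea_msgs.git
$ cd .. && catkin_make
```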
I'm editing the sdk.launch file with `rosed`, but I cannot find the correct serial port
This will in most cases be `/dev/ttyUSB0`. If this is incorrect, an error will pop up. To find the correct serial port:
```
$ grep -iP PRODUCT= /sys/bus/usb-serial/devices/ttyUSB0/../uevent
```
CAUTION: there is a space between `PRODUCT=` and `/sys`. This is not a typo.
```
$ lsusb | grep <ID>
```
Replace `<ID>` with the ID found in the previous step.
I don't know what to set the Baudrate to
The Baudrate should be set to 921600. If you are using the DJI Assistant 2 for Matrice to simulate a flight, then you also need to set the same Baudrate inside the DJI Assistant 2 for Matrice app, which can be found under the SDK tab.
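For reference, a sketch of where these values are set on the Manifold side, assuming the stock `dji_sdk` launch file (parameter names can differ between SDK versions):
```
$ rosed dji_sdk sdk.launch
# look for the serial port and baud rate parameters, e.g.:
#   <param name="serial_name" type="string" value="/dev/ttyUSB0"/>
#   <param name="baud_rate"   type="int"    value="921600"/>
```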
Connecting to the simulator and launching the SDK fails for an unknown reason
This can be due to many reasons, but generally it means that you have to set a udev exception, and/or disable advanced sensing and connect the Manifold to the UAV with an additional USB 3.0 to USB 3.0 cable. CAUTION: disabling advanced sensing disables the Matrice 210's built-in object avoidance mechanism.
```
$ echo 'SUBSYSTEM=="usb", ATTRS{idVendor}=="2ca3", MODE="0666"' | sudo tee /etc/udev/rules.d/m210.rules
```
- Change `enable_advanced_sensing` to `false` in the file `DJI/catkin_ws/Onboard-SDK-ROS/dji_sdk/src/modules/dji_sdk_node.cpp`
To install the Rosbridge Server:
```
$ sudo apt-get install ros-kinetic-rosbridge-server
```
Unity versions and installation instructions can be found on this page.
Make sure that you have read and worked through the Hardware Dependencies and Software Dependencies sections before proceeding with the system installation. This is critically important; the system will not work otherwise.
- Clone the project on the VR-Ready Computer with the following command:
```
$ git clone https://github.com/immersive-command-system/ImmersiveDroneInterface_2.git
```
- Initialize the submodules (run this inside the cloned repository):
```
$ git submodule update --init --recursive
```
- Place the RTK Battery inside the RTK Controller, and turn it on.
- Disable RTK Signal (you may need to connect the controller to a phone or tablet with the 'DJI Go 4' app for this step)
- Modify the Manifold's .bashrc to source ROS environment variables:
```
$ echo 'cd $HOME/DJI/catkin_ws && source devel/setup.bash' >> $HOME/.bashrc
```
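To apply the change to your current shell without opening a new terminal, you can source the file directly:
```
$ source $HOME/.bashrc
```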
- In a new terminal, start the DJI SDK:
```
$ roslaunch dji_sdk sdk.launch
```
- Test if the UAV can receive Manifold instructions by running the following command (this should spin the rotors, without actually flying the drone):
```
$ rosservice call /dji_sdk/sdk_control_authority 1
$ rosservice call /dji_sdk/drone_arm_control 1
```
- If the rotors spin, great, we are almost there! Stop the rotors with the following command:
```
$ rosservice call /dji_sdk/drone_arm_control 0
```
- Check that the Manifold is correctly connected to the Ethernet cable. Connect the other end of the Ethernet cable to the VR-Ready computer.
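If you want to verify that the wired link is up before continuing, the standard commands below can help; the address shown is purely hypothetical and depends on how your network is configured:
```
$ ip addr show           # on the Manifold: check that the Ethernet interface has an address
$ ping 192.168.1.10      # hypothetical address of the VR-Ready Computer
```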
- Run the Rosbridge Server. This will launch a WebSocket on port 9090. If you want to use a different port, see this page.
```
$ roslaunch rosbridge_server rosbridge_websocket.launch
```
- Connect the Oculus headset to the VR-Ready Computer. If you have not done so already, follow through the Oculus Rift setup.
- Connect the Manifold to a computer with the DJI Assistant 2 for Matrice using a USB 3.0 to USB 3.0 cable, and launch the Flight Simulator.
- Launch our application via Unity. Find the script named `ROSDroneConnection.cs` (TODO: Is this the correct script?) and replace the IP address of the server with the actual IP address of the Manifold. To find the IP address of the Manifold, use the following command:
```
$ hostname -I
```
- Save and close the script, and launch our application by clicking on the play (small triangle) button inside Unity. If all went well, the terminal from which the Rosbridge Server was launched should report that a client has connected.
- Congratulations, you are ready to fly your UAV in VR!
Follow steps 1-11 as above and skip step 12. Then, set up the RTK GPS Station (TODO). Finally, continue with steps 13-15.
Each time that you want to run our system, you will have to first disable the RTK signal, and then run the DJI SDK and Rosbridge Server. The routine is rather simple:
- Power on the UAV, Manifold, and VR-Ready Computer
- (Optionally) connect the UAV to the DJI Assistant 2 for Matrice, and launch the Flight Simulator
- Turn off the RTK signal through the 'DJI Go 4' app
- Launch the SDK
```
$ roslaunch dji_sdk sdk.launch
```
- Launch the Rosbridge Server
```
$ roslaunch rosbridge_server rosbridge_websocket.launch
```
- Open our system in Unity and click the play button
Moreover, each time your network connection changes, you will have to update the IP address that the Unity client connects to.
TODO: add video guide
(how to manipulate the drone when wearing the VR headset, how to scroll, zoom, rotate...)
TODO: Add a picture that shows our architecture
(Upstream development happens on the Alpha branch. Once most bugs have been eliminated, changes are pushed on the Beta branch for testing. The Master branch gets updated only when the interface is demo-ready.)
TODO: Add pictures
Peru Dayani, Research Lead
Nitzan Orr, Product Manager
Apollo, Interaction Designer
Eric Wang, Data Visualization Engineer
Varun Saran, Network and Streaming Data Engineer
Shreyas Krishnaswamy, Localization Engineer
Newman Hu, Controls Engineer
Rithvik Chuppala, System Administrator
Arya Anand, 3D Artist
Jesse Patterson
Jessica Lee
Ji Han
Paxtan Laker
Rishi Upadhyay
Brian Wu
Eric Zhang
Xin Chen
We would like to thank Dr. Allen Yang and Dr. Kai Vetter for their mentorship and supervision. We would also like to thank our graduate advisors, David McPherson and Joe Menke for their continuous support.
This repository contains four types of files: program files, assets created by us (such as UAV models), Unity prefabs, and SDK files (DJI and MapBox). All program files, unless otherwise stated, are distributed under the GNU General Public License version 3. All media files are distributed under the Creative Commons Attribution-ShareAlike 4.0 International license. For Unity prefabs and SDK files, please refer to their respective licenses. A license notice is included within all files created by us.
If you are in doubt about whether you can use an asset, or how to correctly attribute its authors, please e-mail us at: [email protected].