Welcome to the Eastern Edge Robotics Software 2024. For a production installation guide, navigate to the Docker Installation section (recommended) or Host-Side Installation section. For a Simulation Environment install guide, navigate to the Simulation Section.
The Software 2024 frontend application, located in the GUI folder, is built using the React JavaScript framework. The GUI folder contains a NodeJS (Node JavaScript) workspace featuring, most notably, a package.json file which lists all the dependencies needed to run the development environment and to generate a build/ folder for production.
To run the frontend as a development environment (not intended for production, but functionally equivalent), install NodeJS. Then navigate to the GUI directory and type the following two commands:
npm install
npm start
Note that npm install
will take up a few gigabytes of space as all dependencies are downloaded. However, all downloaded files end up in a folder called node_modules in the GUI directory, so they are easy to track and later delete to free up space. The large install is the only disadvantage of running the frontend through the development environment.
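If you ever need the production build/ folder mentioned above outside of Docker, it can be generated from the same workspace, assuming the standard React build script is defined in package.json:
npm run build
The optimized static files should land in GUI/build/ and can then be served by any static web server.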
The ROS2 (Robot Operating System 2) workspace, located in the ROS2 folder, contains the "Backend" of the software package. It handles things such as i2c communications to the various board components, thruster math, the profiles database, autonomous mode, the simulation environment, and more. It is meant to run on Beaumont's onboard Raspberry Pi 4.
The workspace uses the ROS2 Humble distribution. The development and production installations for the backend are the same.
In production, camera feed from the main enclosure Raspberry Pi 4, as well as from the 3 mini-enclosure Raspberry Pi Zero 2Ws, is streamed over TCP/IP using MJPEG (Motion JPEG). This is done using a GitHub project called Spyglass. To install the streamer on a Raspberry Pi, copy over the /Cameras directory and run the install.sh script inside.
. install.sh
This will configure everything and turn the camera streamer into a daemon service running in the background. To access the camera stream, the following can be typed in a browser:
http://<Raspberry Pi ip>:<Port specified in the spyglass.conf file (default 8080)>/stream
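For example, with a Raspberry Pi at the hypothetical address 192.168.2.10 and the default port, the stream would be available at:
http://192.168.2.10:8080/stream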
The simulator is not part of the production environment. It is meant to completely replace the physical ROV while retaining all elements of the frontend and the majority of elements from the backend (minus i2c board communications for the actual ROV). Navigate to the Simulation section below to learn more.
This guide will allow you to run the backend and frontend application on a Raspberry Pi or any computer running Debian or Ubuntu (as well as likely other Linux flavours).
If you are intending to run docker on a Raspberry Pi running Raspberry Pi OS (tested with Bookworm), copy over the Software 2024 directory (you can use scp over the network; see the example at the end of this guide). Navigate to the Software 2024 directory on the Raspberry Pi and type the following into the terminal.
. all_in_one_pi4_installer.sh
Installation is now done, including the backend, frontend, and camera streamer. The Raspberry Pi should now be ready to drive the bot, provided it has access to an i2c bus with the correct components.
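The copy step mentioned at the start of this guide can be done with scp, for example (the pi username and destination path are assumptions; substitute whatever applies to your Pi):
scp -r Software_2024 pi@<Raspberry Pi ip>:/home/pi/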
Software 2024 Docker Installation for Windows, Mac, or other desktop OS (Recommended for Development)
Ensure the docker daemon is running. This may require having Docker Desktop running on Windows.
To check if the daemon is running, type the following into a terminal:
docker
This should return a list of possible commands associated with docker, indicating docker is installed.
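Note that the bare docker command prints its usage text even when the daemon itself is stopped; for a check that actually contacts the daemon, you can also run:
docker info
If the daemon is not running, this reports an error along the lines of "Cannot connect to the Docker daemon".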
Navigate to Software_2024 (main directory) in the terminal.
Modify the compose.yaml file in the Software_2024 directory based on what you need. Do not proceed to step 4 without doing this, because the default compose.yaml is intended for the Raspberry Pi. The file is commented to help with this.
compose.yaml files are a popular way of defining docker applications because, once they are configured to the user's liking, they only require one command to setup the whole environment.
Run the following in a terminal window:
docker compose up
You can add -d to run the containers in the background (you will regain access to your terminal once the containers finish starting). You can also add --build backend if you made modifications to the backend code and therefore need it recompiled.
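Once the stack is up, a few other standard docker compose commands are useful; the backend service name below is the same one referenced by --build above:
docker compose ps
docker compose logs -f backend
docker compose down
ps lists the containers in the project, logs -f backend follows the backend container's output, and down stops and removes the containers.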
If you would like a rundown of how to use docker including the definition of containers and images, how docker compose files and Dockerfiles work, how to manage containers, images, volumes, and more, look at the following:
- This really good and comprehensive guide
- Tutorials provided by Docker Desktop upon install
This guide will allow you to install the entire software package to production level without docker. This tutorial is intended for Ubuntu Linux 22.04, but may be adapted for other operating systems. This installation method is generally not required.
Ensure you have the Software 2024 repository on your machine.
Install ROS2 Humble (base or complete).
Navigate to the colcon_ws directory and build the backend packages, sourcing the main ROS2 installation first if it is not already sourced.
cd <Path to the Software 2024 repository on your computer>/ROS2/colcon_ws
source /opt/ros/humble/setup.bash
colcon build
Sourcing the ROS2 setup script lets your terminal session know where ROS2 is installed, so that the ros2 command and its packages can be found.
After a successful build, you should see three new folders in colcon_ws (build, install, and log).
Make sure to source ROS2 and this ROS workspace in every new terminal window you open.
source /opt/ros/humble/setup.bash
source <Path to the Software 2024 repository on your computer>/ROS2/colcon_ws/install/setup.bash
Alternatively, you can add the above commands to the /etc/bash.bashrc file, which is run automatically in every new terminal window.
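For example, the two source commands can be appended to /etc/bash.bashrc like this (adjust the workspace path to wherever your colcon_ws actually lives):
echo "source /opt/ros/humble/setup.bash" | sudo tee -a /etc/bash.bashrc
echo "source <Path to the Software 2024 repository on your computer>/ROS2/colcon_ws/install/setup.bash" | sudo tee -a /etc/bash.bashrc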
To allow the running ROS2 instance to communicate with the GUI, install rosbridge.
sudo apt install ros-humble-rosbridge-server
You should now be ready to launch the backend using the following command:
ros2 launch beaumont_pkg beaumont_startup.xml
This will run all required nodes. The backend is now fully running and ready to communicate with the frontend without further setup.
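To confirm that the nodes came up, the standard ROS2 command line tools can be used from another terminal (remember to source ROS2 and the workspace there as well):
ros2 node list
ros2 topic list
node list shows every running node started by the launch file, and topic list shows the topics those nodes publish and subscribe to.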
To run the frontend, refer to the quick development environment installation guide in the Introduction. There is no need to run a production build of the frontend in any place other than a docker container.
When you open the frontend in a browser window, make sure to enter the IP of the machine running the backend in the ROS IP section of the settings tab.
The simulation environment is mainly for use in developing and testing code for any autonomous tasks (science tasks) in the MATE competition. It may also be used for testing features in the GUI or demo run practice.
The Open Source Robotics Foundation (osrf), responsible for writing ROS, also created a simulation software called Gazebo. The version used for this simulation is Gazebo Classic (which reaches end of life in 2025).
Gazebo allows for creating robot models with visuals, collision, physics, plugins, and sensors.
For visuals and collision, models can be made using the normal Gazebo model editor (which has powerful features such as mating), or they can be imported as STL files. An open-source custom Onshape API called Onshape to Robot allows for conveniently downloading Onshape models as STL files. Alternatively, the STL files can be downloaded straight from Onshape.
Plugins allow the Bot to listen to and be controlled by traditional ROS topics. Sensors grab information from the simulation environment and publish their data as ROS topics. This means that the ROS2 workspace can have "simulation" versions of the same nodes that control the actual Bot. These simulation nodes can listen to the same topics as the real ones, and the sensors can publish to the same topics as well (the only sensors currently implemented are cameras, which actually create an MJPEG stream independent of ROS).
This means that, from the point of view of the frontend and a large part of the backend (the SQLite database in profiles_manager.py, the autonomous brain coral transplant action node, the autonomous coral reef modelling code), the simulation environment is the same as the actual bot.
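One way to see this equivalence is with the ROS2 command line tools; the same commands work whether the real backend or the simulation backend is running (the topic name below is a placeholder, use ros2 topic list to find the real ones):
ros2 topic list
ros2 topic info <topic name>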
This guide shows how to set up the simulation environment on Ubuntu 22.04. It may work on other distributions, depending on their compatibility with ROS2 Humble and Gazebo Classic.
Follow the Host-Side installation section.
Install Gazebo Classic. Note that the version used for the simulation at the time of writing is Gazebo Classic 11.10.2.
Install the ROS2 gazebo_ros_pkgs. No custom plugins were written for this simulation; all plugins used are repurposed from that package.
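On Ubuntu 22.04 with ROS2 Humble, both of these are typically available through apt; the package names below are the standard Ubuntu and Humble ones, not something specific to this repository:
sudo apt install gazebo
sudo apt install ros-humble-gazebo-ros-pkgs
The first should install Gazebo Classic 11 on Ubuntu 22.04, and the second installs the ROS2 Humble packages that let Gazebo Classic plugins publish and subscribe to ROS topics.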
Navigate to the worlds folder of the ROS2 directory in the Software package and open a world:
gazebo competition_world
Worlds may open with models already in them. Worlds can also be edited on the fly and then saved as a .world file.
Note that you should source the ROS workspace (as described in the ROS installation guides) in order for the Bot in the simulation to listen to the ROS topics.
Navigate to the Insert tab on the top left and click "Add Path". Navigate to the models folder of the ROS2 directory of the Software package and select it (if it is not already there).
You should now see all of the models made by EER under the Insert tab.
Launch the GUI in the browser. Ensure that the ROS IP in the settings tab is set to the IP of the machine running ROS (localhost if it's the same machine).
In another window, ensure that the ROS workspace is sourced as per the step 1 tutorial. Then run the following command:
ros2 launch beaumont_pkg simulation_beaumont_startup.xml
A launch file is simply a shortcut to running each required node individually.
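For example, the nodes that the launch file starts can also be run one at a time (the executable name below is a placeholder; ros2 pkg executables lists the real ones):
ros2 pkg executables beaumont_pkg
ros2 run beaumont_pkg <executable name>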
Ensure that there is a green checkmark next to ROS in the BotTab in the GUI. Once it is green, navigate to the settings tab.
For each of the 4 cameras, type "http://<IP of the machine running the simulation>:<EITHER 8880, 8881, 8882, or 8883>/cam.mjpg"
This may or may not already be saved and fetchable from the database.
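For example, if the simulation is running on the same machine as the browser, the first camera feed would be:
http://localhost:8880/cam.mjpg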
Plug in a controller and either pick or create a profile.
You should now see the simulated camera feed in the CameraTab. Note that the Bot will not move unless the power multipliers are set to non-zero in the BotTab.