This repository contains the pipeline inspection mission example from our paper *Mission Planning and Safety Assessment for Pipeline Inspection Using Autonomous Underwater Vehicles: A Framework based on Behavior Trees*.
This example repository demonstrates the framework introduced in the paper. A simple example of a square trajectory is integrated within the ROS2 packages `imcpy_ros_bridge` and `remaro_uw_sim`:

- `imcpy_ros_bridge`: a bridge interface between IMC (Inter-Module Communication), ROS (Robot Operating System), and the behaviour trees, facilitating control, communication, and data exchange in the robot mission.
- `remaro_uw_sim`: packages to bridge between UNavSim, ROS, and DUNE. It also includes a data recording script.
- LSTS Toolchain components (DUNE and Neptus), for controlling the AUV.
- UNavSim: for simulating realistic renderings of underwater environments and obtaining sensor recordings such as camera, segmentation, and IMU. Installation instructions available here.
- ROS2: these packages have been tested under ROS2 Foxy and Humble.
- py_trees_ros: installation instructions available here.
`imcpy_ros_bridge` and `remaro_uw_sim` are ROS2 stacks (or metapackages, if you prefer). For ROS version compatibility, we refer to the documentation of the original repositories. You can clone this repository into your colcon workspace and compile these ROS stacks as follows:
```shell
cd $HOME/<path-to-your-colcon-ws>/src
git clone https://github.com/remaro-network/pipe_inspection_mission
cd ..
colcon build
source install/setup.bash
```
Here we will demonstrate the usage of the framework with a simple square trajectory example.
The following figure shows, on the left, the ROS2 stacks `imcpy_ros_bridge` and `remaro_uw_sim`, pointing to the files within the packages required to run the proposed example. On the right, it shows the file structure that the dataset recorder script generates.
Let's set up an example of UNavSim working with a simulated OMST vehicle in DUNE. For that, you will need DUNE, Neptus, UNavSim, and this ROS node all running at the same time. In this example I will show you how to run them one by one.
- DUNE

First, `cd` to your `dune/build` directory and run:

```shell
./dune -c lauv-simulator-1 -p Simulation
```
- Neptus

Then, from the directory where you cloned Neptus, execute Neptus as follows:

```shell
./neptus
```

In the Neptus interface, connect to the `lauv-simulator-1` vehicle to see its state.
- UNavSim

Then, from UNav-Sim, run the simulation in your favourite environment with the `AirSimGameMode` setup.
- ROS

Finally, you can have Neptus and UNavSim running all at once with this ROS package. We have a launcher prepared for you that runs both the `imcpy_ros_bridge` package (to bridge between DUNE and ROS) and the bridge between ROS and UNavSim within this package:

```shell
ros2 launch neptus_interface lauv_simulator_1.launch.py
```
You can record your own dataset with the sensor data from UNavSim using the `mimir_recorder` script. Important note: this script and the rosbag recorder cannot run simultaneously; you have to choose one or the other.

```shell
ros2 run unavsim_ros_pkgs mimir_recorder.py
```
To move the robot in the environment, you can use behaviour trees. You can try our example with:

```shell
ros2 launch imcpy_trees square_trajectory_launch.py
```
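The behaviour tree itself is defined in the `imcpy_trees` package. As a self-contained illustration of the underlying idea (not the package's actual API, which builds on py_trees_ros and the IMC bridge — class and waypoint names below are purely illustrative), a square trajectory can be modelled as a Sequence of four go-to-waypoint leaves:

```python
# Self-contained sketch of the behaviour-tree idea behind the square
# trajectory. Names (GoToWaypoint, waypoints) are illustrative only;
# the real mission uses py_trees_ros behaviours talking to DUNE via IMC.
from enum import Enum

class Status(Enum):
    RUNNING = 0
    SUCCESS = 1
    FAILURE = 2

class GoToWaypoint:
    """Leaf behaviour: step toward a goal, succeeding on arrival."""
    def __init__(self, name, x, y, step=1.0):
        self.name, self.goal, self.step = name, (x, y), step

    def tick(self, state):
        gx, gy = self.goal
        x, y = state["pos"]
        dx, dy = gx - x, gy - y
        dist = (dx * dx + dy * dy) ** 0.5
        if dist < 1e-6:
            return Status.SUCCESS
        scale = min(self.step / dist, 1.0)  # move at most `step` per tick
        state["pos"] = (x + dx * scale, y + dy * scale)
        return Status.RUNNING

class Sequence:
    """Composite: ticks children in order; RUNNING/FAILURE short-circuit."""
    def __init__(self, children):
        self.children, self.idx = children, 0

    def tick(self, state):
        while self.idx < len(self.children):
            status = self.children[self.idx].tick(state)
            if status is not Status.SUCCESS:
                return status
            self.idx += 1
        return Status.SUCCESS

# A 10 m x 10 m square as four go-to leaves under one sequence.
square = Sequence([
    GoToWaypoint("wp1", 10, 0),
    GoToWaypoint("wp2", 10, 10),
    GoToWaypoint("wp3", 0, 10),
    GoToWaypoint("wp4", 0, 0),
])

state = {"pos": (0.0, 0.0)}
ticks = 0
while square.tick(state) is Status.RUNNING:
    ticks += 1  # each RUNNING tick advances the vehicle one step
```

A real tree would publish setpoints over the IMC bridge instead of mutating a local state dict, but the control-flow skeleton — a sequence that only advances when each leg reports success — is the same.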
The ROS2 topics published during the mission are:
Package | Publisher node | Topic | Type | Content |
---|---|---|---|---|
`unavsim_ros_pkgs` | `unavsim_node` | `/<camera_name>/Scene` | `sensor_msgs/msg/Image` | UNavSim RGB camera |
 | | `/<camera_name>/Segmentation` | `sensor_msgs/msg/Image` | UNavSim segmentation labels |
 | | `/<camera_name>/DepthPlanar` | `sensor_msgs/msg/Image` | UNavSim depth camera |
 | | `/<camera_name>/Scene/camera_info` | `sensor_msgs/msg/CameraInfo` | UNavSim camera intrinsics |
 | | `/imu/Imu` | `sensor_msgs/msg/Imu` | UNavSim IMU measurements |
 | | `/altimeter/barometer` | `unavsim_interfaces/msg/Altimeter` | UNavSim altimeter measurements |
 | | `/tf` | `tf2_msgs/msg/TFMessage` | 6-DOF pose in UNavSim |
`imcpy_ros_bridge` | `imc2ros` | `/from_imc/base_link` | `geometry_msgs/msg/PoseStamped` | 6-DOF pose in DUNE |
 | | `/from_imc/estimated_state` | `imc_ros_msgs/msg/EstimatedState` | 6-DOF pose estimated by DUNE |
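The 6-DOF poses above (e.g. on `/from_imc/base_link`) carry orientation as a quaternion, following the `geometry_msgs/msg/PoseStamped` convention. A minimal sketch of recovering the vehicle's heading (yaw) from such a quaternion, assuming plain floats rather than a live ROS message:

```python
import math

def quaternion_to_yaw(x, y, z, w):
    """Yaw (rotation about Z, radians) from a unit quaternion,
    using the standard ZYX Euler extraction."""
    siny_cosp = 2.0 * (w * z + x * y)
    cosy_cosp = 1.0 - 2.0 * (y * y + z * z)
    return math.atan2(siny_cosp, cosy_cosp)

# Identity orientation -> 0 rad; 90-degree turn about Z -> pi/2.
yaw0 = quaternion_to_yaw(0.0, 0.0, 0.0, 1.0)
yaw90 = quaternion_to_yaw(0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4))
```

With ROS available, the same function can be applied to the `msg.pose.orientation` fields inside a subscriber callback.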
The recorded image data comprises RGB, segmentation, and depth frames.
This work is part of the Reliable AI for Marine Robotics (REMARO) Project. For more info, please visit: https://remaro.eu/
This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 956200.