Usage¶
The benchmark process involves the following steps:
- Download the IILABS 3D dataset
- Set up the Docker environment
- Run SLAM algorithms on the dataset
- Evaluate the results
Download the IILABS 3D Dataset¶
The IILABS 3D dataset is available at the INESC TEC research data repository. You can download it manually or use the IILABS 3D toolkit for easier access.
Using the IILABS 3D Toolkit¶
The IILABS 3D Toolkit provides utilities for working with the dataset.
Installation¶
pip install iilabs3d-toolkit
Autocompletion
You can install autocompletion for the toolkit by running:
iilabs3d --install-completion
Downloading Sequences¶
To download a specific sequence for a specific sensor:
iilabs3d download <output_directory> <sequence_name> <sensor_name>
For example, to download the loop benchmark sequence for the Livox Mid-360 sensor:
iilabs3d download ~/slam_data loop livox_mid_360
To download all benchmark sequences for all sensors:
iilabs3d download ~/slam_data bench all
Dataset Structure
The sequences will be saved in the following structure:
<output_directory>/iilabs3d-dataset/<sequence_prefix>/<sensor_name>/<sequence_name>/
Dataset Directory
The provided Docker Compose file is configured to mount the ${HOME}/slam_data directory, allowing containerized access to the dataset files. To ensure proper functionality, save your dataset in this directory. Alternatively, you can modify the docker-compose.yml file to specify a different path if needed.
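If your dataset lives somewhere other than ${HOME}/slam_data, the bind mount in docker-compose.yml can be pointed at that location. The fragment below is only a sketch: the service name ros1_noetic matches the compose commands used later in this guide, but the container-side path is an assumption; keep whatever container path the original file uses.

```yaml
services:
  ros1_noetic:
    volumes:
      # host path : container path
      # Replace the host side with your dataset location; keep the
      # container side as found in the original docker-compose.yml.
      - /mnt/datasets/slam_data:/root/slam_data
```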
Set Up the Docker Environment¶
The benchmark uses Docker to ensure a consistent environment for all SLAM algorithms. This approach guarantees reproducibility and simplifies the setup process.
Prerequisites¶
- Docker installed on your system
- At least 10GB of free disk space
- 16GB RAM recommended for optimal performance
Guide for installing Docker and other tools in Ubuntu 20.04
For more detailed information about the Docker installation and additional setups, please refer to the Install Docker section.
Clone the Repository¶
git clone https://github.com/JorgeDFR/3d_lidar_slam_benchmark_at_iilab.git
cd 3d_lidar_slam_benchmark_at_iilab/docker
Build and Run Docker Containers¶
The benchmark uses two Docker images. You can either build the images locally from the provided Dockerfiles (slower, compiles libraries/packages) or pull prebuilt images from Docker Hub (faster).
The ROS 1 Noetic image contains the following SLAM algorithms:
- A-LOAM
- LeGO-LOAM-BOR
- LIORF
- DLIO
Option 1: Pull Prebuilt Image¶
docker pull jorgedfr/3d_slam_ros1:noetic
Option 2: Build Image Locally¶
docker compose build ros1_noetic
Start Docker Container¶
docker compose up ros1_noetic -d
Access Docker Container¶
docker exec -it 3d_slam_ros1 bash
The ROS 2 Humble image contains the following SLAM algorithms:
- VineSLAM
- KISS-ICP
- GLIM
- Kinematic-ICP
- MOLA-LO
Option 1: Pull Prebuilt Image¶
docker pull jorgedfr/3d_slam_ros2:humble
Option 2: Build Image Locally¶
docker compose build ros2_humble
Start Docker Container¶
docker compose up ros2_humble -d
Access Docker Container¶
docker exec -it 3d_slam_ros2 bash
GUI Applications
Before running RViz inside the Docker container, you need to set up xhost:
./setup_xhost.sh
Run SLAM Algorithms on the Dataset¶
In one terminal inside the ROS 1 container, set the environment variables to select the algorithm and 3D LiDAR sensor, and start the SLAM algorithm:
# Set environment variables
export SLAM_CONF=<algorithm_name>
# Options: aloam, lego_loam_bor, liorf, dlio
export SLAM_SENSOR=<sensor_name>
# Options: velodyne_vlp_16, ouster_os1_64, robosense_rs_helios_5515, livox_mid_360
# Start the SLAM algorithm
roslaunch slam_benchmark_ros1_conf slam_benchmark.launch
In another terminal, play the rosbag of the desired sequence:
rosbag play <rosbag_file_path>
In one terminal inside the ROS 2 container, set the environment variables to select the algorithm and 3D LiDAR sensor, and start the SLAM algorithm:
# Set environment variables
export SLAM_CONF=<algorithm_name>
# Options: vineslam, kiss_icp, glim, kinematic_icp, mola_lo
export SLAM_SENSOR=<sensor_name>
# Options: velodyne_vlp_16, ouster_os1_64, robosense_rs_helios_5515, livox_mid_360
# Start the SLAM algorithm
ros2 launch slam_benchmark_ros2_conf slam_benchmark.launch.xml
In another terminal, play the rosbag of the desired sequence:
ros2 bag play <rosbag_file_path>
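A typo in SLAM_CONF or SLAM_SENSOR only surfaces later, inside the launch files. The helper below is a hypothetical pre-launch sanity check (it is not part of the benchmark repository); the valid values are taken from the option lists above.

```shell
# Hypothetical pre-launch sanity check (not part of the benchmark repo).
# Valid values taken from the option lists in this guide.
ROS1_CONFS="aloam lego_loam_bor liorf dlio"
ROS2_CONFS="vineslam kiss_icp glim kinematic_icp mola_lo"
SENSORS="velodyne_vlp_16 ouster_os1_64 robosense_rs_helios_5515 livox_mid_360"

# in_list VALUE LIST -> succeeds if VALUE is one of the words in LIST
in_list() {
  case " $2 " in *" $1 "*) return 0 ;; *) return 1 ;; esac
}

export SLAM_CONF=glim
export SLAM_SENSOR=ouster_os1_64

if in_list "$SLAM_CONF" "$ROS1_CONFS $ROS2_CONFS" \
   && in_list "$SLAM_SENSOR" "$SENSORS"; then
  echo "ok: $SLAM_CONF on $SLAM_SENSOR"
else
  echo "unknown SLAM_CONF or SLAM_SENSOR" >&2
  exit 1
fi
```

Run it (or source it) before launching; it fails fast instead of letting the launch file fail with a less obvious error.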
VineSLAM Special Case
VineSLAM requires recompilation of its ROS 2 package whenever the LiDAR sensor is changed. Use the -DLIDAR_TYPE CMake argument to specify your sensor:
- -DLIDAR_TYPE=0 (default) for Velodyne VLP-16
- -DLIDAR_TYPE=1 for RoboSense RS-Helios-5515
- -DLIDAR_TYPE=3 for Ouster OS1-64
- -DLIDAR_TYPE=4 for Livox Mid-360
cd ros2_ws
colcon build --packages-select vineslam_ros --cmake-args -DLIDAR_TYPE=0
Offline Processing Mode¶
Alternatively, several SLAM algorithms support an offline mode that processes rosbag files faster than real-time without losing messages:
For ROS 1:
roslaunch slam_benchmark_ros1_conf slam_benchmark.launch run_offline:=true rosbag_path:=<rosbag_file_path>
For ROS 2:
ros2 launch slam_benchmark_ros2_conf slam_benchmark.launch.xml run_offline:=true rosbag_path:=<rosbag_file_path>
Supported Algorithms for Offline Mode
The following algorithms currently support offline processing:
- LeGO-LOAM-BOR
- GLIM
- Kinematic-ICP
- MOLA-LO
Record and Evaluate Results¶
Recording Odometry Trajectories¶
To record the odometry trajectory generated by the SLAM algorithms, run the following command in another terminal:
For ROS 1:
rosbag record -O <output_bag_file_name> /slam_odom
For ROS 2:
ros2 bag record -o <output_bag_file_name> /slam_odom
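When benchmarking several algorithm/sensor/sequence combinations, a consistent naming scheme for the recorded bags keeps results distinguishable. The scheme below is only a suggestion (nothing in the benchmark mandates it):

```shell
# Hypothetical naming scheme: encode algorithm, sensor and sequence in
# the output bag name so recorded runs stay sortable and self-describing.
SLAM_CONF=dlio
SLAM_SENSOR=livox_mid_360
SEQUENCE=loop
BAG_NAME="${SLAM_CONF}_${SLAM_SENSOR}_${SEQUENCE}"
echo "$BAG_NAME"
# then, depending on the container:
#   rosbag record -O "$BAG_NAME" /slam_odom        (ROS 1)
#   ros2 bag record -o "$BAG_NAME" /slam_odom      (ROS 2)
```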
MOLA-LO Special Case
For MOLA-LO, set an environment variable to use the LiDAR frame instead of the default base_link frame. This step is necessary because MOLA-LO expects the robot frame to be base_link, while the IILABS 3D dataset uses eve/base_link. Then use the mola-lidar-odometry-cli command. For more information regarding this tool, please refer to the MOLA main documentation.
# Set environment variable to use the LiDAR frame instead of base_link
export MOLA_USE_FIXED_LIDAR_POSE=true
mola-lidar-odometry-cli \
-c $(ros2 pkg prefix mola_lidar_odometry)/share/mola_lidar_odometry/pipelines/lidar3d-default.yaml \
--input-rosbag2 <path_to_bag_file> \
--lidar-sensor-label <lidar_topic> \
--output-tum-path <output_file_path>
- <path_to_bag_file>: Path to your input rosbag file
- <lidar_topic>: LiDAR sensor topic name (/eve/ouster/points for Ouster sequences or /eve/lidar3d for the other sensors)
- <output_file_path>: Desired path for the TUM-formatted trajectory output
Converting to TUM Format¶
Now, to convert the odometry trajectory to TUM format, you can use the evo toolkit, which is automatically installed along with the IILABS 3D toolkit:
For ROS 1 bags:
evo_traj bag <rosbag_file_path> slam_odom --save_as_tum
For ROS 2 bags:
evo_traj bag2 <rosbag_file_path> slam_odom --save_as_tum
Evaluating Trajectories¶
Use the IILABS 3D toolkit to evaluate the trajectory against the ground truth:
iilabs3d eval <ground_truth.tum> <odometry.tum>
This will calculate metrics including Absolute Trajectory Error (ATE), Relative Translational Error (RTE), and Relative Rotational Error (RRE).
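When evaluating the same sequence across all four sensors, a loop keeps the calls consistent. The *.tum file names below are assumptions; adapt them to however you stored the ground-truth and odometry files. The echo makes this a dry run that only prints the commands; remove it to actually invoke the toolkit.

```shell
# Hypothetical batch evaluation over all four sensors. The .tum file
# names are assumptions; adjust them to your own layout. "echo" makes
# this a dry run; drop it to actually run the toolkit.
for sensor in velodyne_vlp_16 ouster_os1_64 robosense_rs_helios_5515 livox_mid_360; do
  echo iilabs3d eval "ground_truth_${sensor}.tum" "odometry_${sensor}.tum"
done
```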