Semantic Pointcloud Filter

License: GPL v3

[Project page]   [Paper]   [Data]   [Video]

Overview

Official implementation of the paper Seeing Through the Grass: Semantic Pointcloud Filter for Support Surface Learning. Here you can find the code for SPF training and for self-supervised label generation.

Installation

Dependencies

Create and activate the conda environment:

conda env create -f environment.yaml
conda activate spf_venv

If you want to visualize the reconstructed support surface created by the Gaussian Process, you also need to install msgpack-c and add the grid_map packages to your catkin workspace.

Install SPF

cd semantic_front_end_filter
pip install -e .

Getting started

Self-supervised label generation

For details on how to reconstruct the support surface from the robot's feet trajectories, see here.
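The linked write-up covers the full reconstruction pipeline. As a rough sketch of the underlying idea only (function names, kernel choice, and hyperparameters here are illustrative, not the repository's actual API), Gaussian Process regression over recorded foothold positions yields a height estimate with uncertainty at arbitrary query locations:

```python
import numpy as np

def gp_surface_height(foot_xy, foot_z, query_xy, length_scale=0.3, noise=1e-2):
    """Estimate terrain height (mean and variance) at query_xy from
    recorded foothold positions via GP regression with an RBF kernel.
    Illustrative sketch only; not the repository's implementation."""
    def rbf(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length_scale ** 2)

    K = rbf(foot_xy, foot_xy) + noise * np.eye(len(foot_xy))  # training covariance
    Ks = rbf(query_xy, foot_xy)                               # cross-covariance
    mean = Ks @ np.linalg.solve(K, foot_z)                    # posterior mean
    var = 1.0 - (Ks * np.linalg.solve(K, Ks.T).T).sum(-1)     # posterior variance
    return mean, var
```

The predicted variance is what makes the label self-supervised-friendly: cells far from any foothold get high uncertainty and can be down-weighted during training.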

Dataset structure

You can download our training data from here; it is built on data collected in Perugia, Italy.

This dataset contains six trajectories: three for training and three for evaluation. To visualize a single data point from any trajectory, use the following command:

python semantic_front_end_filter/scripts/dataset_vis.py --data_path <path-to-data-folder>

This will display an image like the one shown below:

  • Left column: Model input for the SPF network.
  • Right column: Ground truth labels needed for training.
    • SSDE Label: Support Surface Depth Estimation (mean and variance)
    • SSSeg Label: Support Surface Semantic Segmentation (Obstacles vs. Support Surface)
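The exact on-disk schema is defined by the dataset code; purely as an illustration of which tensors one sample bundles (all field names and shapes below are hypothetical, not the actual format):

```python
import numpy as np

# Hypothetical layout of a single training sample, mirroring the columns
# in the visualization above. Field names and shapes are illustrative.
sample = {
    "image": np.zeros((3, 480, 640), dtype=np.float32),     # camera RGB input
    "pc_depth": np.zeros((1, 480, 640), dtype=np.float32),  # projected pointcloud input
    "ssde_mean": np.zeros((480, 640), dtype=np.float32),    # support surface depth label
    "ssde_var": np.ones((480, 640), dtype=np.float32),      # label uncertainty
    "ssseg": np.zeros((480, 640), dtype=np.int64),          # 0 = support surface, 1 = obstacle
}
```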

Train model

For your own robot, you need to raycast the reconstructed support surface from the camera's point of view to obtain the supervision labels for depth estimation.
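A minimal sketch of such a raycasting step, assuming a pinhole camera model with intrinsics K and surface points already expressed in the camera frame (function and variable names are illustrative, not the repository's API):

```python
import numpy as np

def render_depth_label(points_cam, K, H, W):
    """Project support-surface points (camera frame, z forward) through
    pinhole intrinsics K and keep the nearest depth per pixel.
    Illustrative sketch; a real pipeline would raycast a mesh/grid map."""
    pts = points_cam[points_cam[:, 2] > 0]      # keep points in front of the camera
    uvw = (K @ pts.T).T
    u = (uvw[:, 0] / uvw[:, 2]).astype(int)     # pixel column
    v = (uvw[:, 1] / uvw[:, 2]).astype(int)     # pixel row
    depth = np.full((H, W), np.inf)             # inf = no label at this pixel
    keep = (u >= 0) & (u < W) & (v >= 0) & (v < H)
    for ui, vi, zi in zip(u[keep], v[keep], pts[keep, 2]):
        depth[vi, ui] = min(depth[vi, ui], zi)  # z-buffer: nearest surface wins
    return depth
```

Pixels left at infinity simply carry no supervision signal and can be masked out of the depth loss.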

To start training, run

python semantic_front_end_filter/scripts/train.py --data_path <path-to-data-folder>

To validate the trained model, run

python semantic_front_end_filter/scripts/eval.py --model <path-to-model-folder> --outdir <path-to-save-the-evaluation-plot> --data_path <path-to-data-folder>

Our trained model can be downloaded here. Please remember to download the whole folder.

Citing this work

@ARTICLE{qiao23spf,
  author={Li, Anqiao and Yang, Chenyu and Frey, Jonas and Lee, Joonho and Cadena, Cesar and Hutter, Marco},
  journal={IEEE Robotics and Automation Letters}, 
  title={Seeing Through the Grass: Semantic Pointcloud Filter for Support Surface Learning}, 
  year={2023},
  volume={8},
  number={11},
  pages={7687-7694},
  doi={10.1109/LRA.2023.3320016}
 }