Continual Detection Transformer for Incremental Object Detection


[Paper] [Project Page]

This repository contains the PyTorch implementation for the CVPR 2023 Paper "Continual Detection Transformer for Incremental Object Detection" by Yaoyao Liu, Bernt Schiele, Andrea Vedaldi, and Christian Rupprecht.

This is preliminary code. If you have any questions about this repository or the related paper, feel free to open an issue or send me an email.

Installation and Datasets

This code is based on Deformable DETR. You may follow the instructions at https://github.com/fundamentalvision/Deformable-DETR to install the required packages and prepare the datasets for this project.

Requirements

  • Linux, CUDA>=9.2, GCC>=5.4

  • Python>=3.7

    We recommend using Anaconda to create a conda environment:

    conda create -n cl_detr python=3.7 pip

    Then, activate the environment:

    conda activate cl_detr
  • PyTorch>=1.5.1, torchvision>=0.6.1 (following instructions here)

    For example, if your CUDA version is 9.2, you can install PyTorch and torchvision as follows:

    conda install pytorch=1.5.1 torchvision=0.6.1 cudatoolkit=9.2 -c pytorch
  • Other requirements

    pip install -r requirements.txt

Compiling CUDA Operators

cd ./models/ops
sh ./make.sh
# unit test (should see all checking is True)
python test.py

Dataset Preparation

Please download the COCO 2017 dataset and organize it as follows:

code_root/
└── data/
    └── coco/
        ├── train2017/
        ├── val2017/
        └── annotations/
            ├── instances_train2017.json
            └── instances_val2017.json
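Before launching training, it can save time to verify the layout above is in place. The following is a small stand-alone sketch (not part of the repository) that reports any expected COCO path missing under the code root:

```python
from pathlib import Path

# Paths the code expects relative to code_root (from the tree above)
EXPECTED_PATHS = [
    "data/coco/train2017",
    "data/coco/val2017",
    "data/coco/annotations/instances_train2017.json",
    "data/coco/annotations/instances_val2017.json",
]

def missing_coco_paths(code_root="."):
    """Return the expected COCO paths that are missing under code_root."""
    root = Path(code_root)
    return [p for p in EXPECTED_PATHS if not (root / p).exists()]

if __name__ == "__main__":
    missing = missing_coco_paths()
    if missing:
        print("Missing COCO paths:", *missing, sep="\n  ")
    else:
        print("COCO 2017 layout looks complete.")
```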

Performance

Incremental object detection results (%) on COCO 2017. In the A+B setup, the first phase observes a fraction $\frac{A}{A+B}$ of the training samples, with A categories annotated. The second phase then observes the remaining $\frac{B}{A+B}$ of the training samples, with B new categories annotated.

| Setting | Detection Baseline | $AP$ | ${AP}_{50}$ | ${AP}_{75}$ | ${AP}_{S}$ | ${AP}_{M}$ | ${AP}_{L}$ |
|---------|--------------------|------|-------------|-------------|------------|------------|------------|
| 70+10   | Deformable DETR    | 40.1 | 57.8        | 43.7        | 23.2       | 43.2       | 52.1       |
| 40+40   | Deformable DETR    | 37.5 | 55.1        | 40.3        | 20.9       | 40.8       | 50.7       |
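As a minimal sketch (not the repository's actual data-loading code), the A+B split described above amounts to partitioning an ordered list of category ids into the base classes seen in phase 1 and the new classes seen in phase 2:

```python
def split_categories(category_ids, num_base):
    """Partition category ids into the A base classes annotated in phase 1
    and the B new classes annotated in phase 2."""
    return category_ids[:num_base], category_ids[num_base:]

# 70+10 setting on COCO's 80 object categories (ids here are illustrative;
# real COCO category ids are non-contiguous)
base_classes, new_classes = split_categories(list(range(80)), 70)
print(len(base_classes), len(new_classes))  # 70 10
```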

Checkpoints

You may download the checkpoints here: [link]. The experimental setting is COCO 2017, 70+10. Please put the phase-0 checkpoint, phase_0.pth, in the base directory before running the code. The current version automatically loads the phase-0 checkpoint to speed up the experiments: phase 0 is not an incremental learning phase, so it is identical to standard Deformable DETR training.
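For illustration only (the helper name and logic below are assumptions, not the repository's code), the decision of when to resume from the phase-0 checkpoint could be expressed like this:

```python
import os

def phase0_checkpoint_path(base_dir, phase):
    """Return the phase-0 checkpoint path if this phase should resume from it.

    Phase 0 is standard Deformable DETR training, so only the later,
    incremental phases load the saved phase-0 weights.
    """
    if phase == 0:
        return None  # phase 0 trains from scratch, nothing to resume
    path = os.path.join(base_dir, "phase_0.pth")
    return path if os.path.isfile(path) else None

# The checkpoint itself would then be loaded with something like
# torch.load(path, map_location="cpu") before building the model.
```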

Running Experiments

Run the following script to start the experiment for COCO 2017, 70+10:

bash run.sh

If you need to run experiments for the 40+40 setting, you may need to change the code in multiple files, e.g., main.py and datasets/pycocotools.py. Please refer to this branch for the 40+40 experiments: https://github.com/yaoyao-liu/CL-DETR/tree/40_40

Citation

Please cite our paper if it is helpful to your work:

@inproceedings{Liu2023CLDETR,
  author       = {Yaoyao Liu and
                  Bernt Schiele and
                  Andrea Vedaldi and
                  Christian Rupprecht},
  title        = {Continual Detection Transformer for Incremental Object Detection},
  booktitle    = {{IEEE/CVF} Conference on Computer Vision and Pattern Recognition,
                  {CVPR} 2023, Vancouver, BC, Canada, June 17-24, 2023},
  pages        = {23799--23808},
  publisher    = {{IEEE}},
  year         = {2023}
}

Acknowledgement

Our implementation uses source code from Deformable DETR: https://github.com/fundamentalvision/Deformable-DETR
