
# AdaSim

[arXiv]

This repo contains the PyTorch implementation of our ICCV 2023 paper:

Adaptive Similarity Bootstrapping for Self-Distillation based Representation Learning

Tim Lebailly*, Thomas Stegmüller*, Behzad Bozorgtabar, Tinne Tuytelaars, and Jean-Philippe Thiran.


## Dependencies

Our code only has a few dependencies. First, install PyTorch for your machine following https://pytorch.org/get-started/locally/. Then, install the remaining dependencies:

```shell
pip install einops
```

## Pretraining

### Single GPU pretraining

Run `main_adasim.py`; the command-line arguments are defined in `parser.py`.

```shell
python main_adasim.py --args1 val1
```

Make sure to use the right arguments specified in the table below!
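If you want to see how such arguments are typically wired up, here is a minimal argparse sketch. The argument names and defaults below are hypothetical illustrations, not the actual contents of `parser.py`; consult that file for the real options.

```python
import argparse

def get_parser():
    # Hypothetical sketch of a pretraining argument parser; the real
    # arguments live in parser.py of this repo.
    parser = argparse.ArgumentParser(description="AdaSim pretraining (sketch)")
    parser.add_argument("--batch_size", type=int, default=64,
                        help="per-GPU batch size")
    parser.add_argument("--epochs", type=int, default=100,
                        help="number of pretraining epochs")
    parser.add_argument("--lr", type=float, default=5e-4,
                        help="base learning rate")
    return parser

# Parse an example command line (overrides two of the defaults).
args = get_parser().parse_args(["--batch_size", "32", "--epochs", "200"])
```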

### 1 node pretraining

```shell
python -m torch.distributed.launch --nproc_per_node=8 main_adasim.py --args1 val1
```

## Citation

If you find our work useful, please consider citing:

```bibtex
@article{lebailly2023adaptive,
  title={Adaptive Similarity Bootstrapping for Self-Distillation},
  author={Lebailly, Tim and Stegm{\"u}ller, Thomas and Bozorgtabar, Behzad and Thiran, Jean-Philippe and Tuytelaars, Tinne},
  journal={arXiv preprint arXiv:2303.13606},
  year={2023}
}
```

## Acknowledgments

This code is adapted from DINO.