# DART: Diversify-Aggregate-Repeat Training

This repository contains the code for training and evaluation of our CVPR 2023 paper *DART: Diversify-Aggregate-Repeat Training Improves Generalization of Neural Networks* (main paper and supplementary). The arXiv version of the paper is also available.

## Environment Settings

- Python 3.6.9
- PyTorch 1.8
- Torchvision 0.8.0
- Numpy 1.19.2
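
As a minimal installation sketch, assuming a pip-based environment, the dependencies above could be installed as shown below. Note that each Torchvision release is pinned to a specific PyTorch version (Torchvision 0.8.x officially pairs with PyTorch 1.7.x), so one of the two pins listed above may need adjusting:

```bash
# Illustrative install of the versions listed above (pip-based setup assumed).
# Adjust the PyTorch/Torchvision pair and CUDA build for your system.
pip install torch==1.8.0 torchvision==0.8.0 numpy==1.19.2
```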

## Training

To train DART on the Domain Generalization task:

```bash
python train_all.py [name_of_exp] --data_dir ./path/to/data --algorithm ERM --dataset PACS --inter_freq 1000 --steps 10001
```

## Combine with SWAD

Set `swad: True` in the `config.yaml` file, or pass `--swad True` on the command line.
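
As a minimal sketch, the corresponding `config.yaml` entry would look like the following (assuming the key name mirrors the `--swad` flag; this has not been verified against the repository):

```yaml
# Enable SWAD weight averaging; the key name mirrors the --swad CLI flag.
swad: True
```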

## Changing Model & Hyperparams

Similarly, to change the model (e.g., ViT), the SWAD hyperparameters, or the MIRO hyperparameters, update the `config.yaml` file or pass them as arguments on the command line:

```bash
python train_all.py [name_of_exp] --data_dir ./path/to/data \
    --lr 3e-5 \
    --inter_freq 600 \
    --steps 8001 \
    --dataset OfficeHome \
    --algorithm MIRO \
    --ld 0.1 \
    --weight_decay 1e-6 \
    --swad True \
    --model clip_vit-b16
```
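
Equivalently, these settings can be placed in `config.yaml`. The following is a sketch assuming the config keys mirror the command-line flags one-to-one (the key names are assumptions, not verified against the repository):

```yaml
# Illustrative config.yaml equivalent of the command above.
# Key names are assumed to mirror the CLI flags one-to-one.
lr: 3e-5
inter_freq: 600
steps: 8001
dataset: OfficeHome
algorithm: MIRO
ld: 0.1
weight_decay: 1e-6
swad: True
model: clip_vit-b16
```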

## Results

- In-Domain Generalization of DART
- Domain Generalization of DART
- Combining DART with other DG methods on Office-Home

## Citing this work

```bibtex
@inproceedings{jain2023dart,
  title={DART: Diversify-Aggregate-Repeat Training Improves Generalization of Neural Networks},
  author={Jain, Samyak and Addepalli, Sravanti and Sahu, Pawan Kumar and Dey, Priyam and Babu, R Venkatesh},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={16048--16059},
  year={2023}
}
```