For details, please refer to the paper "TransXNet: Learning Both Global and Local Dynamics with a Dual Dynamic Token Mixer for Visual Recognition".
# Environments:
```
cuda==11.3
python==3.8.15
```
# Packages:
```
mmcv==1.7.1
mmdet==2.28.2
timm==0.6.12
torch==1.12.1
torchvision==0.13.1
```
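Assuming a CUDA 11.3 setup, the pinned versions above can be installed with pip roughly as follows. The wheel index URLs follow the standard PyTorch and OpenMMLab conventions and are assumptions here; note that MMDetection 2.x typically requires the full build of mmcv (`mmcv-full`), so adjust if this repo expects the lite package.

```shell
# PyTorch wheels built against CUDA 11.3 (index URL per PyTorch's
# previous-versions instructions; adjust for your CUDA version).
pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 \
    --extra-index-url https://download.pytorch.org/whl/cu113

# Detection stack. mmcv-full must match the torch/CUDA combination;
# the find-links URL follows mmcv's documented pattern.
pip install mmcv-full==1.7.1 \
    -f https://download.openmmlab.com/mmcv/dist/cu113/torch1.12/index.html
pip install mmdet==2.28.2 timm==0.6.12
```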
# Data preparation:
Prepare the COCO 2017 dataset following the MMDetection data-preparation guidelines.
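By MMDetection convention, the dataset is expected under `data/coco` in the repository root; a typical layout (standard MMDetection arrangement, shown here as a reference) looks like:

```
TransXNet
└── data
    └── coco
        ├── annotations
        │   ├── instances_train2017.json
        │   └── instances_val2017.json
        ├── train2017
        └── val2017
```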
# Results on COCO:
| Method | Backbone | Pretrain | Lr schd | Aug | box AP | mask AP | Config | Download |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| RetinaNet | TransXNet-T | ImageNet-1K | 1x | No | 43.1 | - | config | log & model |
| RetinaNet | TransXNet-S | ImageNet-1K | 1x | No | 46.4 | - | config | log & model |
| RetinaNet | TransXNet-B | ImageNet-1K | 1x | No | 47.6 | - | config | log & model |
| Mask R-CNN | TransXNet-T | ImageNet-1K | 1x | No | 44.5 | 40.7 | config | log & model |
| Mask R-CNN | TransXNet-S | ImageNet-1K | 1x | No | 47.7 | 43.1 | config | log & model |
| Mask R-CNN | TransXNet-B | ImageNet-1K | 1x | No | 48.8 | 43.8 | config | log & model |
# Training:
To train TransXNet-T + RetinaNet on COCO train2017 with 8 GPUs (single node), run:
```shell
bash dist_train.sh configs/retinanet_transx_t_fpn_1x_coco.py 8
```
To train TransXNet-T + Mask R-CNN on COCO train2017 with 8 GPUs (single node), run:
```shell
bash dist_train.sh configs/mask_rcnn_transx_t_fpn_1x_coco.py 8
```
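The config files passed to `dist_train.sh` follow MMDetection 2.x's inheritance mechanism: a small file overrides the backbone and neck of a base detector config. As a rough sketch only (the base-file paths, registry name, and channel widths below are assumptions for illustration, not the repo's actual contents), `retinanet_transx_t_fpn_1x_coco.py` would look something like:

```python
# Hypothetical MMDetection 2.x config sketch; values are illustrative,
# not copied from this repository.
_base_ = [
    '../_base_/models/retinanet_r50_fpn.py',
    '../_base_/datasets/coco_detection.py',
    '../_base_/schedules/schedule_1x.py',
    '../_base_/default_runtime.py',
]

model = dict(
    backbone=dict(
        type='TransXNet',  # backbone name as registered in mmdet (assumed)
        init_cfg=dict(
            type='Pretrained',
            checkpoint='path/to/imagenet1k_pretrained.pth',  # placeholder
        ),
    ),
    # FPN must receive the backbone's per-stage channel widths;
    # the numbers here are placeholders, not TransXNet-T's real widths.
    neck=dict(in_channels=[64, 128, 256, 512]),
)
```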
# Evaluation:
To evaluate TransXNet-T + RetinaNet on COCO val2017, run:
```shell
bash dist_test.sh configs/retinanet_transx_t_fpn_1x_coco.py /path/to/checkpoint_file 8 --out results.pkl --eval bbox
```
To evaluate TransXNet-T + Mask R-CNN on COCO val2017, run:
```shell
bash dist_test.sh configs/mask_rcnn_transx_t_fpn_1x_coco.py /path/to/checkpoint_file 8 --out results.pkl --eval bbox segm
```
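The `results.pkl` written by `dist_test.sh` can be inspected with Python's `pickle` module. A minimal sketch, assuming the usual MMDetection 2.x detector output of one per-class list of `[x1, y1, x2, y2, score]` boxes per image (the structure is assumed here, so the snippet builds a synthetic file rather than reading a real one):

```python
import pickle

# Synthetic stand-in for dist_test.sh output: one entry per image,
# each entry a per-class list of [x1, y1, x2, y2, score] detections.
fake_results = [
    [[[10.0, 20.0, 50.0, 80.0, 0.91]], []],  # image 0: one class-0 box
    [[], [[5.0, 5.0, 30.0, 40.0, 0.42]]],    # image 1: one class-1 box
]
with open("results.pkl", "wb") as f:
    pickle.dump(fake_results, f)

# Load and summarize, exactly as you would with the real file.
with open("results.pkl", "rb") as f:
    results = pickle.load(f)

num_boxes = sum(len(cls_boxes) for img in results for cls_boxes in img)
print(len(results), num_boxes)  # → 2 2 (two images, two detections)
```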
# Citation:
If you find this project useful for your research, please consider citing:
```bibtex
@article{lou2023transxnet,
  title={TransXNet: Learning Both Global and Local Dynamics with a Dual Dynamic Token Mixer for Visual Recognition},
  author={Lou, Meng and Zhou, Hong-Yu and Yang, Sibei and Yu, Yizhou},
  journal={arXiv preprint arXiv:2310.19380},
  year={2023}
}
```
If you have any questions, please feel free to open an issue or contact me at [email protected].