This repository contains the implementation of "SEM: Switchable Excitation Module for Self-attention Mechanism" [paper] on the CIFAR-10 and CIFAR-100 datasets.
SEM is a self-attention module that can automatically select and integrate attention operators to compute attention maps.
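The sketch below illustrates this idea in PyTorch; it is not the authors' code, and the choice of candidate operators (an SE-style MLP, a single fully-connected layer, and an identity pass-through) and the softmax switch network are illustrative assumptions based on the description above.

```python
import torch
import torch.nn as nn

class SEMSketch(nn.Module):
    """Illustrative sketch of a switchable excitation module.

    A channel descriptor from global average pooling is fed to several
    candidate excitation operators; a small switch network produces weights
    that select and integrate their outputs into one channel attention map.
    """
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Candidate attention operators (hypothetical choices for illustration).
        self.operators = nn.ModuleList([
            nn.Sequential(nn.Linear(channels, channels // reduction),
                          nn.ReLU(inplace=True),
                          nn.Linear(channels // reduction, channels)),  # SE-style MLP
            nn.Linear(channels, channels),                              # single FC layer
            nn.Identity(),                                              # pass-through
        ])
        # Switch network: decides how much each operator contributes.
        self.switch = nn.Sequential(nn.Linear(channels, len(self.operators)),
                                    nn.Softmax(dim=-1))
        self.gate = nn.Sigmoid()

    def forward(self, x):
        b, c, _, _ = x.size()
        s = self.pool(x).view(b, c)                                   # channel descriptor
        w = self.switch(s)                                            # (b, num_operators)
        outs = torch.stack([op(s) for op in self.operators], dim=1)  # (b, num_operators, c)
        a = self.gate((w.unsqueeze(-1) * outs).sum(dim=1))           # integrated attention map
        return x * a.view(b, c, 1, 1)                                 # rescale input features
```

In a residual network such as ResNet-164, a module like this would typically be inserted after the last convolution of each block to rescale its output channels.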
The code requires Python and PyTorch.
pip install -r requirements.txt
python run.py --dataset cifar100 --block-name bottleneck --depth 164 --epochs 164 --schedule 81 122 --gamma 0.1 --wd 1e-4
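To reproduce the CIFAR-10 result in the table below, the same command presumably works with only the dataset flag changed (this assumes `run.py` accepts `cifar10` as a value):

python run.py --dataset cifar10 --block-name bottleneck --depth 164 --epochs 164 --schedule 81 122 --gamma 0.1 --wd 1e-4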
Model | Dataset | original | SEM
---|---|---|---
ResNet164 | CIFAR10 | 93.39 | 94.95
ResNet164 | CIFAR100 | 74.30 | 76.76
@article{zhong2022switchable,
title={Switchable Self-attention Module},
author={Zhong, Shanshan and Wen, Wushao and Qin, Jinghui},
journal={arXiv preprint arXiv:2209.05680},
year={2022}
}
Many thanks to bearpaw for his simple and clean PyTorch framework for image classification tasks.