SIGUA

ICML'20: SIGUA: Forgetting May Make Learning with Noisy Labels More Robust

This is the code for the ICML'20 paper "SIGUA: Forgetting May Make Learning with Noisy Labels More Robust". If you find this code useful in your research, please cite:

@InProceedings{pmlr-v119-han20c,
  title     = {{SIGUA}: Forgetting May Make Learning with Noisy Labels More Robust},
  author    = {Han, Bo and Niu, Gang and Yu, Xingrui and Yao, Quanming and Xu, Miao and Tsang, Ivor and Sugiyama, Masashi},
  booktitle = {International Conference on Machine Learning},
  pages     = {4006--4016},
  year      = {2020}
}

Setup

Install Miniconda3, and then

conda env create -f environment.yml
conda activate pytorch1.0.0_py36
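
To confirm the installation, an optional quick check from a Python shell (the expected version is an assumption based on the environment name pytorch1.0.0_py36, not something stated elsewhere in this repository):

import torch
print(torch.__version__)  # expected: 1.0.0, per the environment name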

Running SIGUA on benchmark datasets (MNIST, CIFAR-10)

sh scripts/mnist_sigua_sl.sh
sh scripts/mnist_sigua_bc.sh
sh scripts/cifar10_sigua_sl.sh
sh scripts/cifar10_sigua_bc.sh
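
For orientation, the core SIGUA idea is to run ordinary gradient descent on the samples the learner currently trusts and a small, underweighted gradient ascent step on the remaining, suspect samples, so the network partially forgets what it has fit to bad labels. Below is a minimal PyTorch sketch of one such update; it is not the repository's exact code, and the function name, the small-loss selection rule, and the gamma factor are illustrative assumptions (see the scripts above and the paper for the actual training loops).

import torch
import torch.nn.functional as F

def sigua_step(model, optimizer, x, y, good_frac=0.7, gamma=0.01):
    # One mini-batch update: gradient descent on the small-loss ("good")
    # samples, underweighted gradient ascent on the rest ("bad" samples).
    logits = model(x)
    losses = F.cross_entropy(logits, y, reduction="none")

    # Treat the small-loss fraction of the batch as correctly labelled.
    n_good = int(good_frac * len(losses))
    sorted_idx = torch.argsort(losses)
    good_idx, bad_idx = sorted_idx[:n_good], sorted_idx[n_good:]

    loss = losses[good_idx].mean()
    if bad_idx.numel() > 0:
        # Underweighted ascent: subtract a scaled loss on suspect samples,
        # so the network "forgets" part of what it learned from them.
        loss = loss - gamma * losses[bad_idx].mean()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

The underweighting factor gamma keeps the ascent step much smaller than the descent step, which is what separates SIGUA from plain gradient ascent on suspect data.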

Another reproducible version

An alternative reproducible implementation of SIGUA is available at https://github.com/yeachan-kr/pytorch-sigua
