HyperMask: Adaptive Hypernetwork-based Masks for Continual Learning

Generate a semi-binary mask for a target network using a hypernetwork.

Scheme of the HyperMask method (figure).

Use the environment.yml file to create a conda environment with the necessary libraries (e.g. `conda env create -f environment.yml`). One of the most essential packages is hypnettorch, which makes it easy to create hypernetworks in PyTorch.
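For orientation, the sketch below shows how hypnettorch can be used to generate tensors shaped like a target network's parameters and apply them element-wise as a mask. It only illustrates the general idea from the description above; the network sizes, the sigmoid used to make the mask values lie in [0, 1], and all other details are assumptions, not the repository's actual implementation.

```python
import torch
from hypnettorch.mnets import MLP
from hypnettorch.hnets import HMLP

# Target network whose weights will be masked (shapes are illustrative).
mnet = MLP(n_in=784, n_out=10, hidden_layers=(256, 256))

# Hypernetwork producing one tensor per target parameter,
# with one conditional embedding per task.
hnet = HMLP(mnet.param_shapes, cond_in_size=32, layers=(100, 100),
            num_cond_embs=5)

task_id = 0
mask_logits = hnet.forward(cond_id=task_id)

# Squash the hypernetwork output into [0, 1]; torch.sigmoid is only a
# stand-in for the semi-binary masking function described in the paper.
masks = [torch.sigmoid(m) for m in mask_logits]

# Element-wise masking of the target network's own parameters.
masked_weights = [w * m for w, m in zip(mnet.internal_params, masks)]

x = torch.randn(8, 784)
y = mnet.forward(x, weights=masked_weights)
print(y.shape)  # torch.Size([8, 10])
```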

DATASETS

The implemented experiments use four publicly available continual-learning datasets: Permuted MNIST, Split MNIST, Split CIFAR-100 and Tiny ImageNet. The datasets can be downloaded when the algorithm runs.

USAGE

The description of HyperMask is included in the paper. To run experiments with the best hyperparameters found and reproduce the results from the publication for five different seed values, run main.py with the variable create_grid_search set to False and the variable dataset set to PermutedMNIST, SplitMNIST, CIFAR100 or TinyImageNet. For CIFAR100 and TinyImageNet, either ResNet-20 or ZenkeNet can be selected as the target network: set part = 0 to train ResNets and part = 1 to train ZenkeNets. In the remaining cases, the variable part has no effect.
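As a rough illustration, the relevant settings might look like the following; the variable names come from the description above, while their exact placement inside main.py is an assumption.

```python
# Illustrative values only; edit the corresponding variables in main.py.
create_grid_search = False   # reproduce the published results, no grid search
dataset = "CIFAR100"         # "PermutedMNIST", "SplitMNIST", "CIFAR100" or "TinyImageNet"
part = 0                     # CIFAR100 / TinyImageNet only: 0 -> ResNet-20, 1 -> ZenkeNet
```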

Also, to prepare experiments with CIFAR-100 according to the FeCAM scenario, set the variable dataset in main.py to CIFAR100_FeCAM_setup, with part = 6 to train a ResNet model or part = 7 to train a ZenkeNet model.
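Again as an illustration only (the surrounding code in main.py is assumed):

```python
# FeCAM-style CIFAR-100 experiments.
dataset = "CIFAR100_FeCAM_setup"
part = 6   # 6 -> ResNet target network, 7 -> ZenkeNet target network
```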

One can also perform hyperparameter optimization using a grid search. For this purpose, set the variable create_grid_search to True in main.py and modify the lists of hyperparameters for the selected dataset in datasets.py.
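The exact structure of those lists depends on datasets.py; the snippet below only sketches the general idea of a grid defined by per-hyperparameter lists, and all names and values in it are hypothetical.

```python
# Hypothetical hyperparameter grid for one dataset; the real keys and
# values in datasets.py may differ.
grid = {
    "learning_rates": [1e-3, 5e-4],
    "batch_sizes": [64, 128],
    "hypernetwork_hidden_layers": [[100, 100], [200, 200]],
}
```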
