- Clone using:

```bash
git clone --recurse-submodules https://github.com/TomasGadea/MLP-NAS.git
cd MLP-NAS
```
- Create a Python 3.9.13 environment (the python3 version of scar):

```bash
python3 -m venv environ
source environ/bin/activate
pip install -r requirements.txt
```
- Ask @TomasGadea for the config file `config.json` and add your wandb API key to it (this last step is optional).
Open a tmux session rooted in `MLP-NAS` and run:

```bash
sh execute.sh
```
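If you are not already inside tmux, a minimal sketch of this step (the session name is just an illustrative choice) could be:

```bash
tmux new -s mlp-nas   # detachable session so the run survives SSH disconnects
cd MLP-NAS
sh execute.sh
```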
To see all available params check `main.py`.

- `--use-amp` stores `True` when added and uses `torch.cuda.amp.autocast` and `torch.cuda.amp.GradScaler`.
- `--wandb` stores `True` when added and logs into your wandb account using your API key (optional). A hypothetical invocation using both flags is sketched below.
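For example, a manual call enabling both options might look like the following sketch (any other required hyperparameters are omitted here and should be taken from `execute.sh`):

```bash
# Mixed-precision training (autocast + GradScaler) with wandb logging enabled
python3 main.py --use-amp --wandb
```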
Open a tmux session rooted in `MLP-NAS` and run:

```bash
sh fixed_execute.sh
```

To see all available params check `fixed_main.py`.
- `--path-to-supernet` is the output path of any past experiment of `execute.sh`. Check the example in `fixed_execute.sh`; a hypothetical invocation is also sketched below.
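Assuming a previous Train Search run whose output directory is represented here by a placeholder, the retraining step might be launched roughly like this (any remaining required arguments are omitted; see `fixed_execute.sh` for the real call):

```bash
# Retrain the fixed architecture found by an earlier Train Search experiment
python3 fixed_main.py --path-to-supernet out/<experiment-dir>/
```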
Output files for Train Search are:

- `flops_table.txt`: string-formatted table of n_params and flops of the model.
- `log.csv`: metrics such as acc, F, mmc, along epochs.
- `params.json`: parameters that include all arguments called in `execute.sh` and other extra info.
- `W.pt`: last version of the model saved in PyTorch format after all training epochs.
- `W_test.pt`: best version of the model saved in PyTorch format after all training epochs.
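By default these files land under `out/` (see the note at the end of this section), so assuming a finished run whose directory name is a placeholder below, the results can be listed with:

```bash
ls out/<experiment-dir>/
# expected contents: flops_table.txt  log.csv  params.json  W.pt  W_test.pt
```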
Output files for Retrain Fixed are:

- `flops_table.txt`: string-formatted table of n_params and flops of the model.
- `log.csv`: metrics such as acc, mmc, along epochs.
- `params.json`: parameters that include all arguments called in `fixed_execute.sh` and other extra info.
- `W.pt`: last version of the model saved in PyTorch format after all training epochs.
Retrain Fixed files are stored in the `out/retrain/` directory, unlike Train Search files, which go directly into the `out/` directory. Both locations can be changed using the `--output` arg if desired.
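For instance, a hypothetical override of the default Retrain Fixed location (both directory names below are placeholders) might look like:

```bash
# Write Retrain Fixed results to a custom directory instead of out/retrain/
python3 fixed_main.py --path-to-supernet out/<experiment-dir>/ --output <custom-output-dir>/
```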