Scheduler #362
How can I use a different scheduler like ReduceLROnPlateau?
Answered by yoshitomo-matsubara · May 29, 2023
Hi @Adesh696,

For torchdistill <= 0.33, you can use built-in PyTorch schedulers simply by specifying the class name and kwargs, e.g., MultiStepLR for the final training of densenet_bc_k12:

```yaml
scheduler:
  type: 'MultiStepLR'
  params:
    milestones: [150, 225]
    gamma: 0.1
```

If you want to use ReduceLROnPlateau, then you will need to replace the scheduler entry with:

```yaml
scheduler:
  type: 'ReduceLROnPlateau'
  params:
    mode: 'min'
    factor: 0.1
    # ... other kwargs for instantiating ReduceLROnPlateau (except optimizer), e.g., threshold, threshold_mode, cooldown, min_lr, eps, verbose, if necessary
```
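The practical difference from MultiStepLR is that ReduceLROnPlateau steps on a monitored value rather than on the epoch count. A minimal plain-PyTorch sketch of that behavior (the model, optimizer, and `val_loss` below are only illustrative, not part of a torchdistill config):

```python
import torch
from torch.optim.lr_scheduler import ReduceLROnPlateau

# Illustrative model and optimizer; any nn.Module and optimizer would do
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# mode='min' means the LR is reduced when the monitored value stops decreasing
scheduler = ReduceLROnPlateau(optimizer, mode='min', factor=0.1)

for epoch in range(10):
    val_loss = 1.0 / (epoch + 1)  # placeholder for a real validation loss
    # ReduceLROnPlateau.step() takes the monitored metric,
    # unlike MultiStepLR.step(), which takes no argument
    scheduler.step(val_loss)
```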
3 replies
@Adesh696
You need to change this line and add `metrics=your_metrics` as part of `training_box.post_process()`, e.g., `training_box.post_process(metrics=-val_top1_accuracy)`.
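As a rough sketch of where that call sits (only `post_process(metrics=...)` comes from the reply above; the loop and the `train_one_epoch()`/`evaluate()` helpers are hypothetical stand-ins for your own training and validation code):

```python
# Hypothetical per-epoch loop; only post_process(metrics=...) is from the reply above.
for epoch in range(num_epochs):
    train_one_epoch(training_box, epoch)        # hypothetical training step
    val_top1_accuracy = evaluate(training_box)  # hypothetical evaluation helper
    # Negated because the config above uses mode: 'min', so a higher
    # accuracy should look like a lower (better) monitored value
    training_box.post_process(metrics=-val_top1_accuracy)
```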
Please close this discussion and "Mark as answer" if it resolves the issue.