Backend pytorch: Support more lr schedulers (#1392)
supercgor authored Jul 21, 2023
1 parent 676ebb6 commit 411ca5c
Showing 2 changed files with 16 additions and 0 deletions.
4 changes: 4 additions & 0 deletions deepxde/model.py
@@ -93,6 +93,10 @@ def compile(
- For backend PyTorch:
    - `StepLR <https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.StepLR.html>`_: ("step", step_size, gamma)
    - `CosineAnnealingLR <https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.CosineAnnealingLR.html>`_: ("cosine", T_max, eta_min)
    - `InverseTimeLR <https://www.tensorflow.org/api_docs/python/tf/keras/optimizers/schedules/InverseTimeDecay>`_: ("inverse time", decay_steps, decay_rate)
    - `ExponentialLR <https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.ExponentialLR.html>`_: ("exponential", gamma)
    - `LambdaLR <https://pytorch.org/docs/stable/generated/torch.optim.lr_scheduler.LambdaLR.html>`_: ("lambda", lambda_fn: Callable[[step], float])
- For backend PaddlePaddle:
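A minimal usage sketch of the `decay` options documented above (not part of the commit; the toy function-approximation problem and all values are illustrative, and the backend must be set to PyTorch, e.g. via DDE_BACKEND=pytorch):

import numpy as np
import deepxde as dde  # assumes DDE_BACKEND=pytorch

# Toy 1D function-approximation problem; any dde.data/dde.nn setup works here.
geom = dde.geometry.Interval(-1, 1)
data = dde.data.Function(geom, lambda x: np.sin(np.pi * x), 16, 100)
net = dde.nn.FNN([1, 20, 20, 1], "tanh", "Glorot uniform")
model = dde.Model(data, net)

# New in this commit: "inverse time", "exponential", and "lambda" decay tuples.
model.compile("adam", lr=1e-3, decay=("inverse time", 1000, 0.5))
model.train(iterations=2000)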
12 changes: 12 additions & 0 deletions deepxde/optimizers/pytorch/optimizers.py
@@ -64,6 +64,18 @@ def _get_learningrate_scheduler(optim, decay):
        return torch.optim.lr_scheduler.StepLR(
            optim, step_size=decay[1], gamma=decay[2]
        )
    elif decay[0] == "cosine":
        return torch.optim.lr_scheduler.CosineAnnealingLR(
            optim, decay[1], eta_min=decay[2]
        )
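    # Emulate TensorFlow's InverseTimeDecay with LambdaLR, since PyTorch has no
    # built-in inverse-time scheduler; the lambda's factor rescales the initial lr.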
elif decay[0] == "inverse time":
return torch.optim.lr_scheduler.LambdaLR(
optim, lambda step: 1 / (1 + decay[2] * (step / decay[1]))
)
elif decay[0] == "exponential":
return torch.optim.lr_scheduler.ExponentialLR(optim, decay[1])
elif decay[0] == "lambda":
return torch.optim.lr_scheduler.LambdaLR(optim, decay[1])

    # TODO: More learning rate schedulers
    raise NotImplementedError(
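The inverse-time branch is the least obvious addition: LambdaLR multiplies the initial learning rate by the lambda's return value, so the schedule is lr0 / (1 + decay_rate * step / decay_steps). A self-contained sketch of that trajectory (values illustrative, not from the commit):

import torch

decay_steps, decay_rate = 1000, 0.5  # illustrative values
optim = torch.optim.Adam([torch.zeros(1, requires_grad=True)], lr=1e-3)

# Same construction as decay = ("inverse time", decay_steps, decay_rate) above.
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optim, lambda step: 1 / (1 + decay_rate * (step / decay_steps))
)

for _ in range(1000):
    optim.step()       # gradients omitted; only the lr trajectory matters here
    scheduler.step()
print(optim.param_groups[0]["lr"])  # 1e-3 / (1 + 0.5) ≈ 6.67e-4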
