Commit efeea8f

Add adabelief to the readme. (#210)

jettify authored Oct 20, 2020
1 parent 9c72aa0 commit efeea8f

Showing 4 changed files with 36 additions and 0 deletions.
35 changes: 35 additions & 0 deletions README.rst

@@ -60,6 +60,9 @@ Supported Optimizers
| `AccSGD`_ | https://arxiv.org/abs/1803.05591 |
+-------------+-------------------------------------------------------------------------------+
| | |
| `AdaBelief`_| https://arxiv.org/abs/2010.07468 |
+-------------+-------------------------------------------------------------------------------+
| | |
| `AdaBound`_ | https://arxiv.org/abs/1902.09843 |
+-------------+-------------------------------------------------------------------------------+
| | |
@@ -261,6 +264,38 @@ AccSGD

**Reference Code**: https://github.com/rahulkidambi/AccSGD


AdaBelief
---------

+-------------------------------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------+
| .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rastrigin_AdaBelief.png | .. image:: https://raw.githubusercontent.com/jettify/pytorch-optimizer/master/docs/rosenbrock_AdaBelief.png |
+-------------------------------------------------------------------------------------------------------------+--------------------------------------------------------------------------------------------------------------+

.. code:: python

    import torch_optimizer as optim

    # model = ...
    optimizer = optim.AdaBelief(
        model.parameters(),
        lr=1e-3,
        betas=(0.9, 0.999),
        eps=1e-3,
        weight_decay=0,
        amsgrad=False,
        weight_decouple=False,
        fixed_decay=False,
        rectify=False,
    )
    optimizer.step()
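
A minimal end-to-end sketch for context (the model, data, and loss below
are illustrative placeholders, not part of the library):

.. code:: python

    import torch
    import torch_optimizer as optim

    # Illustrative model and data; any torch.nn.Module works the same way.
    model = torch.nn.Linear(10, 1)
    x, y = torch.randn(32, 10), torch.randn(32, 1)

    optimizer = optim.AdaBelief(model.parameters(), lr=1e-3)

    loss = torch.nn.functional.mse_loss(model(x), y)
    optimizer.zero_grad()  # clear gradients from any previous step
    loss.backward()        # populate .grad on the parameters
    optimizer.step()       # apply the AdaBelief update
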
**Paper**: *AdaBelief Optimizer, adapting stepsizes by the belief in observed gradients* (2020) [https://arxiv.org/abs/2010.07468]

**Reference Code**: https://github.com/juntang-zhuang/Adabelief-Optimizer
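
For orientation, a sketch of the core update from the paper (bias
correction omitted): where Adam accumulates the second moment of the
gradient :math:`g_t`, AdaBelief accumulates the squared deviation of
:math:`g_t` from its exponential moving average :math:`m_t`:

.. math::

    m_t &= \beta_1 m_{t-1} + (1 - \beta_1)\, g_t \\
    s_t &= \beta_2 s_{t-1} + (1 - \beta_2)\, (g_t - m_t)^2 + \epsilon \\
    \theta_t &= \theta_{t-1} - \frac{\alpha\, m_t}{\sqrt{s_t} + \epsilon}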


AdaBound
--------

Binary file added docs/rastrigin_AdaBelief.png
Binary file added docs/rosenbrock_AdaBelief.png
1 change: 1 addition & 0 deletions examples/viz_optimizers.py
@@ -191,6 +191,7 @@ def LookaheadYogi(*a, **kw):
    (optim.A2GradUni, -8, 0.1),
    (optim.A2GradInc, -8, 0.1),
    (optim.A2GradExp, -8, 0.1),
    (optim.AdaBelief, -8, 0.1),
]
execute_experiments(
    optimizers,
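
The second and third tuple fields look like log-scale learning-rate
search bounds. A hypothetical sketch of how such a tuple could drive a
log-uniform search (hyperopt assumed; ``objective`` is a placeholder,
not a function from this script):

.. code:: python

    from hyperopt import fmin, hp, tpe

    def find_best_lr(optimizer_cls, lr_low, lr_high, objective, evals=100):
        # hp.loguniform samples exp(u) with u ~ Uniform(lr_low, lr_high),
        # so (-8, 0.1) spans learning rates from ~3.4e-4 to ~1.1.
        space = hp.loguniform('lr', lr_low, lr_high)
        return fmin(
            fn=lambda lr: objective(optimizer_cls, lr),
            space=space,
            algo=tpe.suggest,
            max_evals=evals,
        )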
