- Support `OrthoGrad` feature for `create_optimizer()`. (#324)
- Enhanced flexibility for the `optimizer` parameter in `Lookahead`, `TRAC`, and `OrthoGrad` optimizers. (#324)
    - Now supports both `torch.optim.Optimizer` instances and classes.
    - You can now use the `Lookahead` optimizer in two ways (see the sketch after this list).
        - `Lookahead(AdamW(model.parameters(), lr=1e-3), k=5, alpha=0.5)`
        - `Lookahead(AdamW, k=5, alpha=0.5, params=model.parameters())`
- Implement `SPAM` optimizer. (#324)
- Implement `TAM` and `AdaTAM` optimizers. (#325)
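
A minimal sketch of the two `Lookahead` construction styles listed above; the model and learning rate are placeholders, and the import assumes the package's top-level `Lookahead` export.

```python
import torch.nn as nn
from torch.optim import AdamW

from pytorch_optimizer import Lookahead

model = nn.Linear(4, 2)  # placeholder model

# Style 1: wrap an already-instantiated optimizer.
base = AdamW(model.parameters(), lr=1e-3)
optimizer = Lookahead(base, k=5, alpha=0.5)

# Style 2: pass the optimizer class and supply the parameters via `params`.
optimizer = Lookahead(AdamW, k=5, alpha=0.5, params=model.parameters())
```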