v3.3.4.md


Change Log

Feature

  • Support the OrthoGrad feature in create_optimizer(). (#324)
  • Enhanced flexibility for the optimizer parameter in Lookahead, TRAC, and OrthoGrad optimizers. (#324)
    • Now supports both torch.optim.Optimizer instances and classes
    • You can now use the Lookahead optimizer in two ways:
      • Lookahead(AdamW(model.parameters(), lr=1e-3), k=5, alpha=0.5)
      • Lookahead(AdamW, k=5, alpha=0.5, params=model.parameters())
  • Implement the SPAM optimizer. (#324)
  • Implement the TAM and AdaTAM optimizers. (#325)
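
The dual-signature behavior for Lookahead, TRAC, and OrthoGrad can be sketched as follows. This is a minimal illustration of the pattern, not the library's actual implementation; `Wrapper` and `DummyOptimizer` are hypothetical stand-ins:

```python
import inspect

class Wrapper:
    """Hypothetical sketch of an optimizer wrapper that accepts either an
    optimizer *instance* or an optimizer *class* plus construction args."""

    def __init__(self, optimizer, params=None, **kwargs):
        if inspect.isclass(optimizer):
            # Class form: build the inner optimizer here from params/kwargs.
            if params is None:
                raise ValueError('params is required when passing an optimizer class')
            self.optimizer = optimizer(params, **kwargs)
        else:
            # Instance form: wrap the already-constructed optimizer as-is.
            self.optimizer = optimizer

# Stand-in optimizer used only for this demonstration.
class DummyOptimizer:
    def __init__(self, params, lr=1e-3):
        self.params = list(params)
        self.lr = lr

inner = DummyOptimizer([0.0, 1.0], lr=1e-2)
w1 = Wrapper(inner)                                       # instance form
w2 = Wrapper(DummyOptimizer, params=[0.0, 1.0], lr=1e-2)  # class form
```

Both forms end up with a constructed inner optimizer, which is what makes the two Lookahead call styles listed above interchangeable.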