
introduce lr_scale #568

Open
stephenyan1231 wants to merge 3 commits into main

Conversation

stephenyan1231 (Contributor)

Summary:
For model fine-tuning, we want to use a small LR for the pre-trained trunk and a scaled-up LR for the head modules, which are randomly initialized.
To do this, users of the optimizer can specify a new argument `lr_scale`, which is used to scale up the base LR.

Differential Revision: D22618966
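Since the summary only describes the intended behavior, here is a minimal sketch of how a per-parameter-group `lr_scale` could work on top of plain PyTorch. The `trunk`/`head` module names, the group layout, and folding the scale into each group's LR are illustrative assumptions, not the exact API introduced by this diff.

```python
import torch
import torch.nn as nn

# Hypothetical fine-tuning model: a pre-trained trunk plus a freshly
# initialized head. Module names here are illustrative.
model = nn.Sequential()
model.add_module("trunk", nn.Linear(128, 64))  # pre-trained weights
model.add_module("head", nn.Linear(64, 10))    # randomly initialized

base_lr = 0.01

# Assumed convention: each parameter group carries an `lr_scale`
# multiplier, and the group's effective LR is base_lr * lr_scale.
param_groups = [
    {"params": model.trunk.parameters(), "lr_scale": 1.0},   # small LR for trunk
    {"params": model.head.parameters(), "lr_scale": 10.0},   # scaled-up LR for head
]
for group in param_groups:
    group["lr"] = base_lr * group.pop("lr_scale")

optimizer = torch.optim.SGD(param_groups, lr=base_lr, momentum=0.9)
```

Folding the scale into each group's `lr` up front keeps the trunk/head LR ratio intact under standard multiplicative LR schedulers, which rescale every group by the same factor.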
@facebook-github-bot added the CLA Signed and fb-exported labels on Jul 18, 2020
@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D22618966
