Set diff params groups #55
base: magma
Conversation
Looks good overall; I was wondering if we could make it more general?
So the idea is to instead do this: `finetune_group_lr_info = {key_word, annealing_lr_params}`
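One possible reading of this suggestion, as a minimal sketch: map each keyword to its own learning-rate/annealing settings instead of a single global factor. The field names below are hypothetical, not from this PR:

```python
# Illustrative only: one entry per finetuned-module keyword, each carrying its
# own learning-rate / annealing parameters (field names are hypothetical).
finetune_group_lr_info = {
    "image_prefix": {"lr": 8.0e-5, "warmup_iters": 500, "min_lr": 1.0e-6},
}
```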
Could you also add a small dummy test case?
As we discussed, we can make another PR for further enhancements.
Use the deep.py wrapper to run this test file.
Divide all the parameters into 4 param groups depending on whether they use weight_decay and whether they are finetuned or pretrained.
Add two extra args:
- "finetune_keywords": list of strings; a parameter is put into the finetune groups as long as its name contains one of these keywords. "image_prefix" is the only keyword for now.
- "finetune_factor": float; controls the learning rate of the finetuned groups, whose actual lr = pretrained_lr * finetune_factor.
This way, it leaves enough room for further changes, such as adding more modality encoders, and also avoids adding too many hyperparameters to adjust the lr of the finetune groups.
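For reference, a minimal sketch of the four-way grouping described above (this is not the PR's actual code): `finetune_keywords`, `finetune_factor`, and "image_prefix" come from the description, while `build_param_groups`, `base_lr`, and `no_decay_keywords` are illustrative names.

```python
# Sketch: split parameters into 4 groups by (pretrained/finetuned) x (decay/no_decay).
# Assumes a torch.nn.Module-style model exposing named_parameters().
def build_param_groups(model, base_lr, weight_decay,
                       finetune_keywords=("image_prefix",),
                       finetune_factor=1.0,
                       no_decay_keywords=("bias", "norm")):
    groups = {
        ("pretrained", "decay"): [],
        ("pretrained", "no_decay"): [],
        ("finetune", "decay"): [],
        ("finetune", "no_decay"): [],
    }
    for name, param in model.named_parameters():
        if not param.requires_grad:
            continue
        # A parameter goes into the finetune groups if its name contains any keyword.
        part = ("finetune" if any(k in name for k in finetune_keywords)
                else "pretrained")
        decay = ("no_decay" if any(k in name for k in no_decay_keywords)
                 else "decay")
        groups[(part, decay)].append(param)

    # Finetuned groups use lr = pretrained_lr * finetune_factor.
    return [
        {"params": groups[("pretrained", "decay")],
         "lr": base_lr, "weight_decay": weight_decay},
        {"params": groups[("pretrained", "no_decay")],
         "lr": base_lr, "weight_decay": 0.0},
        {"params": groups[("finetune", "decay")],
         "lr": base_lr * finetune_factor, "weight_decay": weight_decay},
        {"params": groups[("finetune", "no_decay")],
         "lr": base_lr * finetune_factor, "weight_decay": 0.0},
    ]

# Usage (assuming `model` is a torch.nn.Module):
#   import torch
#   optimizer = torch.optim.Adam(
#       build_param_groups(model, base_lr=1.0e-4, weight_decay=0.01,
#                          finetune_keywords=["image_prefix"],
#                          finetune_factor=0.1))
```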