Tutorials fix with set_all_parameters #113
base: main
Conversation
self.fc2.bias.data.fill_(value)
self.fc3.weight.data.fill_(value)
self.fc3.bias.data.fill_(value)
for p in self.parameters():
Q: Is there any reason why `data.fill_` is used?
As far as I'm aware, `torch.nn.init.constant_(param, value)` is the same as `param.data.fill_(value)`.
They should do the same thing, but it was changed to `constant_` because of an issue indicated in this PR.
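For context, the two initialization styles under discussion can be compared in a short sketch (the layer sizes here are illustrative, not from the tutorial):

```python
import torch
import torch.nn as nn

fc = nn.Linear(4, 2)

# In-place fill through the .data attribute: works, but bypasses
# autograd's usual safeguards.
fc.weight.data.fill_(0.5)

# The same fill via the public init API; constant_ performs the
# operation under torch.no_grad(), which is the recommended spelling.
nn.init.constant_(fc.bias, 0.5)

assert torch.all(fc.weight == 0.5) and torch.all(fc.bias == 0.5)
```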
@@ -302,9 +298,8 @@ def forward(self, input):
        out = self.nested(out)

    def set_all_parameters(self, value):
Q: Is there any plan to have a method like this added directly in PyTorch for `nn.Module`?
I don't believe there is any plan to do this. This function isn't defined in CrypTen either - it is defined in a unit test as a helper function. In general, I think it is left to the user to initialize (or otherwise set) parameter values.
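Since such a helper stays in user code, it is easy to write; a minimal sketch, where the class name and layer sizes are illustrative rather than taken from the tutorial:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(8, 4)
        self.fc2 = nn.Linear(4, 2)

    def set_all_parameters(self, value):
        # Iterate generically instead of naming each layer;
        # constant_ performs the fill under torch.no_grad().
        for p in self.parameters():
            nn.init.constant_(p, value)

net = Net()
net.set_all_parameters(0.0)
assert all(torch.all(p == 0.0) for p in net.parameters())
```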
Could you comment on what exactly was failing in the tutorial? We haven't reproduced failures on our end.

For Tutorial 4 it was failing at cell 4. The stack trace is the following:

Maybe this is failing only on the public repo? (thanks @LaRiffle)
Types of changes
Motivation and Context / Related issue
Some tutorial cells were failing because the `set_all_parameters` method does not exist for the modules from PyTorch. Also, iterate over all parameters in a more generic way and set the value using `nn.init.constant_` rather than using the `.data` attribute. (thanks @youben11 and @LaRiffle)

How Has This Been Tested (if it applies)
The tests are passing (WIP)
Checklist