
Tutorials fix with set_all_parameters #113

Open
wants to merge 3 commits into main

Conversation


@gmuraru gmuraru commented May 12, 2020

Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Docs change / refactoring / dependency upgrade

Motivation and Context / Related issue

Some tutorial cells were failing because the set_all_parameters method does not exist on PyTorch modules.

Also, the fix iterates over all parameters in a more generic way and sets their values using nn.init.constant_ instead of going through the .data attribute. (Thanks @youben11 and @LaRiffle.)
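A minimal sketch of the generic approach described above: instead of naming each layer (`fc1`, `fc2`, ...) and calling `.data.fill_`, iterate over `self.parameters()` and use `nn.init.constant_`. The `TinyNet` module and its layer sizes here are hypothetical, chosen only to illustrate the loop.

```python
import torch
import torch.nn as nn


class TinyNet(nn.Module):
    """Hypothetical stand-in for the tutorial's model."""

    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 3)
        self.fc2 = nn.Linear(3, 2)

    def set_all_parameters(self, value):
        # Generic: covers every parameter (weights and biases of all
        # layers) without enumerating them one by one.
        for p in self.parameters():
            nn.init.constant_(p, value)


net = TinyNet()
net.set_all_parameters(0.5)
assert all(bool(torch.all(p == 0.5)) for p in net.parameters())
```

Because the loop goes through `parameters()`, it also picks up parameters of nested submodules automatically, which the per-layer `fill_` calls did not.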

How Has This Been Tested (if it applies)

The tests are passing (WIP)

Checklist

  • The documentation is up-to-date with the changes I made.
  • I have read the CONTRIBUTING document and completed the CLA (see CONTRIBUTING).
  • All tests passed, and additional code has been covered with new tests.

@facebook-github-bot added the CLA Signed label on May 12, 2020
self.fc2.bias.data.fill_(value)
self.fc3.weight.data.fill_(value)
self.fc3.bias.data.fill_(value)
for p in self.parameters():
@gmuraru (Contributor, Author) commented May 12, 2020

Q: Is there any reason why data.fill_ is used?

@knottb (Contributor) commented May 12, 2020

As far as I'm aware, torch.nn.init.constant_(param, value) is the same as param.data.fill_(value).

constant_ doc
fill_ doc
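The equivalence @knottb describes can be checked directly: both calls set every element of the parameter to the given constant, though `nn.init.constant_` does so under `torch.no_grad()` while `.data.fill_` mutates the underlying storage outside autograd tracking. A small sketch:

```python
import torch
import torch.nn as nn

a = nn.Parameter(torch.empty(2, 3))
b = nn.Parameter(torch.empty(2, 3))

nn.init.constant_(a, 1.0)  # in-place init, wrapped in torch.no_grad()
b.data.fill_(1.0)          # mutates .data directly, bypassing autograd

# Same resulting values either way.
assert torch.equal(a, b)
```

The values end up identical; the difference is only in how autograd sees the mutation, which is why `constant_` is the more conventional choice.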

@gmuraru (Contributor, Author)
They should do the same thing, but I changed it to constant_ because of an issue indicated in this PR.

@@ -302,9 +298,8 @@ def forward(self, input):
out = self.nested(out)

def set_all_parameters(self, value):
@gmuraru (Contributor, Author)
Q: Is there any plan to have a method like this added directly in PyTorch for nn.Module?

@knottb (Contributor)

I don't believe there is any plan to do this. This function isn't defined in CrypTen either - it is defined in a unit test as a helper function. In general, I think it is left to the user to initialize (or otherwise set) parameter values.

@knottb (Contributor) commented May 12, 2020

Could you comment on what exactly was failing in the tutorial? We haven't reproduced failures on our end.

@gmuraru (Contributor, Author) commented May 13, 2020

Could you comment on what exactly was failing in the tutorial? We haven't reproduced failures on our end.

For Tutorial 4, it was failing at cell 4 when using the load function (I will update this PR to use the load_from_party function).

The stack trace is the following:

ModuleAttributeError                      Traceback (most recent call last)
<ipython-input-6-fc7a8d678499> in <module>
      1 # Load pre-trained model to Alice
      2 dummy_model = AliceNet()
----> 3 plaintext_model = crypten.load('models/tutorial4_alice_model.pth', dummy_model=dummy_model, src=ALICE)
      4 
      5 # Encrypt the model from Alice:

.../crypten/__init__.py in load(f, preloaded, encrypted, dummy_model, src, load_closure, **kwargs)
    312     )
    313     return load_from_party(
--> 314         f, preloaded, encrypted, dummy_model, src, load_closure, **kwargs
    315     )
    316 

.../crypten/__init__.py in load_from_party(f, preloaded, encrypted, dummy_model, src, load_closure, **kwargs)
    270             elif isinstance(result, torch.nn.Module):
    271                 result_zeros = copy.deepcopy(result)
--> 272                 result_zeros.set_all_parameters(0)
    273             else:
    274                 result = comm.get().broadcast_obj(-1, src)

.../torch/nn/modules/module.py in __getattr__(self, name)
    600                 return modules[name]
    601         raise ModuleAttributeError("'{}' object has no attribute '{}'".format(
--> 602             type(self).__name__, name))
    603 
    604     def __setattr__(self, name, value):

ModuleAttributeError: 'AliceNet' object has no attribute 'set_all_parameters'
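The traceback above shows the root cause: load_from_party deep-copies the module and then calls set_all_parameters(0) on it, which only works for models that happen to define that helper. A generic replacement, as this PR proposes, needs no custom method on the module. The zero_parameters name below is hypothetical, a sketch of the pattern rather than CrypTen's actual code:

```python
import copy

import torch
import torch.nn as nn


def zero_parameters(module):
    # Works for any nn.Module -- no custom set_all_parameters needed.
    zeroed = copy.deepcopy(module)
    for p in zeroed.parameters():
        nn.init.constant_(p, 0)
    return zeroed


m = nn.Linear(3, 2)
z = zero_parameters(m)
assert all(bool(torch.all(p == 0)) for p in z.parameters())
```

Because the copy is zeroed in place, the original model's (randomly initialized) parameters are left untouched, matching the role result_zeros plays in the traceback.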

@gmuraru (Contributor, Author) commented May 14, 2020

Maybe this is failing only on the public repo? (Thanks @LaRiffle.)

Labels: CLA Signed · 3 participants