nn.batch_norm #17

Open
xinmei9322 opened this issue Mar 30, 2017 · 0 comments
In train_mnist_feature_matching.py and other similar files, the generator uses nn.batch_norm, but the updates your implementation defines in self.bn_updates = [(self.avg_batch_mean, new_m), (self.avg_batch_var, new_v)] do not seem to be applied during training in train_mnist_feature_matching.py. (I did see init_updates actually being applied in that file.) I printed the avg_batch_mean values and they are always 0, and the avg_batch_var values are always 1.
That means the batch normalization in your code does not normalize inputs at test time. Is this a bug? Thanks.
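To illustrate the symptom described above, here is a minimal NumPy sketch of the running-statistics pattern that nn.batch_norm appears to implement. The class and parameter names (RunningBatchNorm, momentum, apply_updates) are illustrative assumptions, not code from the repository; the point is that if the analogue of bn_updates is never applied, the running mean stays at its initial 0 and the running variance at its initial 1, so the test-time path performs no real normalization:

```python
import numpy as np

class RunningBatchNorm:
    """Hypothetical sketch of batch norm with running statistics.

    Mirrors the described behavior: avg_batch_mean initialized to 0,
    avg_batch_var initialized to 1, updated by exponential moving
    averages only when the bn_updates analogue is actually applied.
    """

    def __init__(self, dim, momentum=0.9):
        self.avg_batch_mean = np.zeros(dim)  # stays 0 if updates never run
        self.avg_batch_var = np.ones(dim)    # stays 1 if updates never run
        self.momentum = momentum

    def train_step(self, x, apply_updates=True):
        m = x.mean(axis=0)
        v = x.var(axis=0)
        if apply_updates:
            # Analogue of self.bn_updates: exponential moving averages
            # of the per-batch mean and variance.
            self.avg_batch_mean = (self.momentum * self.avg_batch_mean
                                   + (1 - self.momentum) * m)
            self.avg_batch_var = (self.momentum * self.avg_batch_var
                                  + (1 - self.momentum) * v)
        return (x - m) / np.sqrt(v + 1e-8)

    def test_step(self, x):
        # Test time normalizes with the running statistics. If they were
        # never updated, this subtracts 0 and divides by ~1: a no-op.
        return (x - self.avg_batch_mean) / np.sqrt(self.avg_batch_var + 1e-8)
```

If apply_updates is never True during training (the situation reported here, where bn_updates is not passed to the training function), test_step returns the input essentially unchanged, which matches the observed values of always-0 means and always-1 variances.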
