In train_mnist_feature_matching.py and other similar files, the generator uses nn.batch_norm, but the updates your implementation collects in self.bn_updates = [(self.avg_batch_mean, new_m), (self.avg_batch_var, new_v)] never seem to be applied in train_mnist_feature_matching.py. (I did see that init_updates gets applied in that file.) When I print them, the avg_batch_mean values are always 0 and the avg_batch_var values are always 1.
That means batch normalization in your code does not normalize inputs with the accumulated statistics at test time. Is this a bug? Thanks.
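To illustrate the symptom, here is a minimal pure-Python sketch (hypothetical names and momentum value, not the repo's actual code) of how exponential-moving-average batch-norm statistics are meant to evolve during training. If the (variable, new_value) pairs are never passed as updates to the compiled training function, the loop below is effectively never run, and the statistics stay at their initial values of 0 and 1:

```python
import random
import statistics

MOMENTUM = 0.9  # hypothetical decay factor; the repo may use a different value

# Running statistics start at the "do-nothing" values the issue reports.
avg_batch_mean = 0.0
avg_batch_var = 1.0

def bn_update(batch):
    """Fold one batch's statistics into the running averages.

    This mirrors updates of the form
        new_m = momentum * avg_batch_mean + (1 - momentum) * batch_mean
    which only take effect if the (avg, new) pairs are actually handed
    to the compiled training function as updates.
    """
    global avg_batch_mean, avg_batch_var
    batch_mean = statistics.fmean(batch)
    batch_var = statistics.pvariance(batch)
    avg_batch_mean = MOMENTUM * avg_batch_mean + (1 - MOMENTUM) * batch_mean
    avg_batch_var = MOMENTUM * avg_batch_var + (1 - MOMENTUM) * batch_var

random.seed(0)
for _ in range(200):
    batch = [random.gauss(5.0, 2.0) for _ in range(64)]
    bn_update(batch)  # if this step is never applied, the stats stay (0, 1)

# After training, the running stats should approach the data's true
# mean (5) and variance (4); stuck values of (0, 1) mean the updates
# were never wired into the training function.
print(avg_batch_mean, avg_batch_var)
```

If the printed values stay at exactly 0 and 1 after training, test-time normalization is using the initial placeholders rather than the data statistics, which matches what the issue describes.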