I found that CuDNNBatchNormLayer in your caffe branch ( https://github.com/Tongcheng/caffe ) is not registered as a Creator in layer_factory.cpp, but your BatchNormLayer is registered with REGISTER_LAYER_CLASS(BatchNorm);
WenzhMicrosoft changed the title from "BatchNorm with CUDNN doesn't work, so there are no scale factors in BatchNorm layers" to "BatchNorm with CUDNN doesn't work, so there are no scale factors in BatchNorm layers?" on Jul 27, 2017
Hello @WenzhMicrosoft, I am not sure what question we are discussing here, but my understanding is that during layer initialization the layer only reads its configuration from the network's prototxt, as defined by caffe.proto. Therefore I think layer_factory should not matter.
Hi @Tongcheng, you specified "engine: CUDNN" in the prototxt. I think the reason you specified this is that you want to use a CuDNNBatchNormLayer (do you?). It would also make more sense to use CuDNNBatchNormLayer, because it is actually a combination of BatchNormLayer and ScaleLayer; the plain Caffe version of the network needs that extra ScaleLayer to implement the γ and β factors mentioned in the batch normalization paper.
As I said before, Caffe actually creates a BatchNormLayer without γ and β. Is this what you want?
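For reference, the usual BVLC Caffe idiom for getting γ and β without CuDNN support is to pair every BatchNorm layer with a Scale layer; a minimal prototxt sketch (the layer and blob names here are illustrative, not taken from the issue):

```
layer {
  name: "conv1/bn"
  type: "BatchNorm"
  bottom: "conv1"
  top: "conv1"
}
layer {
  name: "conv1/scale"
  type: "Scale"
  bottom: "conv1"
  top: "conv1"
  # bias_term: true makes the Scale layer learn the shift β as well as the scale γ
  scale_param { bias_term: true }
}
```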
WenzhMicrosoft changed the title from "BatchNorm with CUDNN doesn't work, so there are no scale factors in BatchNorm layers?" to "BatchNorm with CUDNN doesn't take effect, so there are no scale factors in BatchNorm layers?" on Jul 27, 2017
A registered CuDNNBatchNormLayer looks like this:
BVLC/caffe@c9eda39#diff-6fe0622356ab61c001bcac36dd571e7d
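For comparison, a creator-based registration in layer_factory.cpp would follow the same pattern as Caffe's GetConvolutionLayer. A hedged sketch (GetBatchNormLayer and the engine field on batch_norm_param are assumptions modeled on the linked commit, not confirmed parts of the Tongcheng branch):

```cpp
// Sketch only: lives inside layer_factory.cpp, namespace caffe, and is
// modeled on the GetConvolutionLayer creator there. GetBatchNormLayer and
// batch_norm_param().engine() are assumptions based on the linked commit.
template <typename Dtype>
shared_ptr<Layer<Dtype> > GetBatchNormLayer(const LayerParameter& param) {
  BatchNormParameter_Engine engine = param.batch_norm_param().engine();
#ifdef USE_CUDNN
  if (engine == BatchNormParameter_Engine_CUDNN) {
    // Only with a creator like this does "engine: CUDNN" select the CuDNN layer.
    return shared_ptr<Layer<Dtype> >(new CuDNNBatchNormLayer<Dtype>(param));
  }
#endif
  // Fallback: the plain BatchNormLayer, which has no learnable gamma/beta.
  return shared_ptr<Layer<Dtype> >(new BatchNormLayer<Dtype>(param));
}

REGISTER_LAYER_CREATOR(BatchNorm, GetBatchNormLayer);
```

With only REGISTER_LAYER_CLASS(BatchNorm), the type "BatchNorm" always maps to the plain BatchNormLayer, so any engine setting in the prototxt is silently ignored.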
So I guess the "engine: CUDNN" setting wouldn't take effect, there are no scale factors, and we need to add Scale layers after each "BatchNorm" layer.