I have a CNN built in PyTorch. I quantize this model, check the test accuracy of both the quantized and non-quantized models, and then export the xmodel (a rough sketch of this flow is included after the log below). The code runs properly and the model gets quantized. The log file looks like this:
[VAIQ_NOTE][QUANTIZER_TORCH_NOT_FUSED_BN]: Node Classifier::Classifier/ResNet[resnet]/BatchNorm1d[fc]/9867 cannot be fused into CONV layers, this is not quantization friendly. It is recommended to adjsut the pattern to CONV+BN.
[VAIQ_NOTE][QUANTIZER_TORCH_NOT_FUSED_BN]: Node Classifier::Classifier/BatchNorm1d[batchNorm2]/input.311 cannot be fused into CONV layers, this is not quantization friendly. It is recommended to adjsut the pattern to CONV+BN.
[VAIQ_NOTE][QUANTIZER_TORCH_NOT_FUSED_BN]: Node Classifier::Classifier/BatchNorm1d[batchNorm3]/input.317 cannot be fused into CONV layers, this is not quantization friendly. It is recommended to adjsut the pattern to CONV+BN.
[VAIQ_NOTE]: =>Doing weights equalization...
[VAIQ_NOTE]: =>Quantizable module is generated.(./TCGA2/MODIFIED/ResNet50/Fold0/quant/Classifier.py)
[VAIQ_NOTE]: =>Get module with quantization.
[122, 134, 136]
non quantized model test_accuracy = 100.0 | test_loss = 0.005811731649406107
quantized model test_accuracy = 99.4898 | test_loss = 0.024748119462491072
[VAIQ_NOTE]: =>Converting to xmodel ...
[VAIQ_NOTE]: =>Successfully convert 'Classifier' to xmodel.(./TCGA2/MODIFIED/ResNet50/Fold0/quant/Classifier_int.xmodel)
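For reference, the quantize / evaluate / export flow follows the usual vai_q_pytorch (pytorch_nndct) torch_quantizer recipe. The following is only a minimal sketch under assumptions: the input shape, the `evaluate` helper, the test loader, and the output directory are placeholders, not the exact script used here.

```python
import torch
import torch.nn.functional as F
from pytorch_nndct.apis import torch_quantizer  # vai_q_pytorch API

def evaluate(model, loader, device):
    """Hypothetical accuracy/loss helper for a classification model."""
    model.eval()
    correct, total, loss_sum = 0, 0, 0.0
    with torch.no_grad():
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            out = model(x)
            loss_sum += F.cross_entropy(out, y, reduction="sum").item()
            correct += (out.argmax(1) == y).sum().item()
            total += y.size(0)
    return 100.0 * correct / total, loss_sum / total

def quantize_and_export(model, test_loader, quant_mode, output_dir="./quant"):
    """Run with quant_mode='calib' first (calibration), then 'test' (evaluate and export the xmodel)."""
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    dummy_input = torch.randn(1, 3, 224, 224)  # assumed input shape

    quantizer = torch_quantizer(quant_mode, model.to(device), (dummy_input,),
                                output_dir=output_dir, device=device)
    quant_model = quantizer.quant_model

    # Forward data through the quantized wrapper: this collects calibration
    # statistics in 'calib' mode and measures accuracy in 'test' mode.
    acc, loss = evaluate(quant_model, test_loader, device)
    print(f"quantized model test_accuracy = {acc:.4f} | test_loss = {loss}")

    if quant_mode == "calib":
        quantizer.export_quant_config()   # save calibration results
    else:
        # xmodel export generally expects the 'test' forward pass at batch size 1
        quantizer.export_xmodel(output_dir=output_dir, deploy_check=False)
```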
My question is: the log says these batch normalization layers cannot be fused into CONV layers and recommends adjusting the pattern to CONV+BN. What exactly is this correct pattern?
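From the wording, my guess is that the quantizer can only fold a BatchNorm that directly follows a Conv layer, whereas a BatchNorm1d sitting after a Linear/fc layer has no Conv to fold into. A minimal illustration of the two cases, with made-up layer sizes:

```python
import torch.nn as nn

# Pattern the quantizer can fuse: a BatchNorm directly following a Conv layer.
fusable = nn.Sequential(
    nn.Conv2d(64, 128, kernel_size=3, padding=1),
    nn.BatchNorm2d(128),   # folded into the preceding Conv at quantization time
    nn.ReLU(inplace=True),
)

# Pattern that triggers the warning: a BatchNorm1d after a Linear (fully connected)
# layer, with no Conv immediately before it to fold the BN into.
not_fusable = nn.Sequential(
    nn.Linear(2048, 512),
    nn.BatchNorm1d(512),   # cannot be fused into a CONV layer
    nn.ReLU(inplace=True),
)
```

Is that the intended reading, or is there a different recommended way to restructure these layers?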