Hi, I found an error when using the excitation-backprop method to attribute my ResNet.
The program raises an error when backpropagating through the pooling layer with the excitation-backprop rule.
The error is 'diff_view_meta->creation_meta == CreationMeta::DEFAULT INTERNAL ASSERT FAILED at /opt/conda/conda-bld/pytorch_1591914838379/work/torch/csrc/autograd/variable.cpp:82, please report a bug to PyTorch.'
I found this is because _patch_pool's backward function contains an in-place multiplication, which PyTorch advises against using with autograd.
TorchRay/torchray/attribution/excitation_backprop.py
Line 401 in 6a198ee
I changed the in-place operation to an out-of-place one, and the error disappeared.
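To illustrate the kind of change I mean, here is a minimal, hypothetical sketch (the class and variable names are mine, not TorchRay's actual `_patch_pool` code): a custom autograd Function whose backward rescales the incoming gradient, with the problematic in-place form shown as a comment and the out-of-place form used instead.

```python
import torch

class ExcitationPoolLike(torch.autograd.Function):
    """Toy stand-in for a patched pooling backward (names are hypothetical)."""

    @staticmethod
    def forward(ctx, x, weight):
        ctx.save_for_backward(weight)
        return x * weight

    @staticmethod
    def backward(ctx, grad_output):
        (weight,) = ctx.saved_tensors
        # In-place version that can trip autograd's internal view checks:
        #   grad_output *= weight
        # Out-of-place version, which avoids mutating the incoming gradient:
        grad_input = grad_output * weight
        return grad_input, None

x = torch.randn(2, 3, requires_grad=True)
w = torch.rand(2, 3)
ExcitationPoolLike.apply(x, w).sum().backward()
print(x.grad)
```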