Hi, thanks for your research.
I took the liberty to fix some issues I've found in the code. I published it here.
In particular, in `Block` the `mlp_ratio` is passed as a float. The MLP hidden size is usually computed as `hidden_dim = int(dim * mlp_ratio)`, but the officially released CFSR code uses the float value directly, causing the dtype issue. Another issue I found was wrapping the Sobel and Laplacian kernels in `FloatTensor`; I first thought the fix was to use `.clone()`, but there's actually no need — the tensors can just be passed directly. I've also added a bool to toggle mean normalization, since the default uses ImageNet statistics, which tends to cause issues on other datasets.
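As a minimal sketch of the `mlp_ratio` fix (the `dim` and `hidden_dim` names are assumed from typical ViT-style blocks, not taken verbatim from the CFSR code):

```python
dim = 64         # embedding dimension (example value)
mlp_ratio = 4.0  # MLP expansion ratio, conventionally passed as a float

# Buggy pattern: using the float product directly as a layer size,
# e.g. nn.Linear(dim, dim * mlp_ratio), fails because feature counts must be ints.
# Fix: cast the product to int before constructing the layer.
hidden_dim = int(dim * mlp_ratio)
print(hidden_dim)  # 256
```

With the cast in place, `hidden_dim` is a plain `int` and can be handed to `nn.Linear(dim, hidden_dim)` without the dtype error.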
Again, thank you for releasing the code for this research, the paper is very interesting 👍
I wanted to add that I'm not a professional in this field, so I'm not entirely certain about the changes I've suggested. However, based on my understanding, it seems to me that the modification might be beneficial.